The Bark

Is AI Stealing Your Face?

Homecoming week just ended, and the fall leaves looked so beautiful that you took pictures the whole time. You pose for one to send to your parents, and yeah, maybe you'll post that one too. Your phone lights up with notifications from everyone, from your closest friends to people you're not sure you've even met. What else could people do with your simple Instagram post besides like, comment, or share it if "they really love you"? What about screenshotting? If it's your mom reposting it to Facebook, it's not weird, but what if it's that random guy in the hallway? The "harmless" guy you rejected but didn't block? While the idea of your picture sitting in someone's camera roll is unsettling, the horrifying part is what they can actually do with it.

You may have been turned into a deepfake: an AI-generated image or video manipulated to appear real. These fabrications, often generated from text prompts, are created by combining existing photographs. Many deepfakes are memes, mostly of politicians, posted online and laughed at.

However, deepfakes can hurt people in ways that aren't humorous. They have been used to create nonconsensual deepfake pornography of real people. An article by Vox states that "96 percent of all deepfake videos were pornographic and nonconsensual." Deepfakes are not a brand-new issue, especially for female celebrities, but with advancements in AI, this can now happen with just a few photographs of anyone: an ex-significant other, a classmate, or someone who follows you on social media. There are thousands of nonconsensual deepfake videos on the internet, from the most popular porn sites to websites where you can pay for custom requests.

Photo from denstoredanske.lex.dk

When I first learned about deepfakes, my mind went straight to who might do that to me. After talking to my friends, we all came to the same conclusion: this could happen to anyone, no matter who they are. Can you imagine looking yourself up online and finding that someone had stolen your face for their pleasure? Noelle Martin, at 17, found deepfake pornography of herself after curiously googling her name, like many of us do. She was rightfully mortified and tried unsuccessfully for years to get the photos and videos taken down. Martin found that the more she spoke out, the more pushback she encountered. She faced victim blaming and claims that the 'content' wouldn't have been made if she had dressed differently. Ten years later, now 28 and a lawyer, Martin still has not been freed from her deepfakes. "You cannot win. This is something that is always going to be out there. It's just like it's forever ruined you" (AP News, Haleluya Hadero).

Popular Twitch streamer QTCinderella found deepfakes of herself on a paid porn site. The website was exposed when a friend and fellow streamer accidentally revealed that he had been paying for it. QTCinderella claims that because those videos exist, "her name, her face, and her brand have become associated with pornography" (USA Today, Jenna Ryu). Like Martin, QTCinderella has faced victim blaming and an onslaught of hate. Another streamer, Maya Higa, who was also featured on the website and is friends with the male streamer, wrote on Twitter, "In 2018, I was inhibited at a party and used for a man's sexual gratification without my consent. Today, I have been used by hundreds of men for sexual gratification without my consent. The world calls my 2018 experience rape. The world is debating over the validity of my experience today."

Some claim that deepfakes can't be damaging because they are not real. However, blogger Ashish Jaiman would disagree. "Pornographic deepfakes can threaten, intimidate and inflict psychological harm...Deepfake porn reduces women to sexual objects, torments them, causing emotional distress, reputation harm, abuse, and...material harm like financial loss," Jaiman stated in his 2020 article "Deepfakes Harms & Threat Modeling." Even though the content is fake, the pain is genuine. Deepfakes also drive women to stop using social media, chiefly by discouraging them from posting images of themselves. While that may be preventive, it feeds into victim blaming.

Although America has no legislation that prevents or punishes the making of deepfakes, people are trying to change that. According to Vogue, a UK campaign called #MyImageMyChoice is petitioning the government to construct "world-leading intimate image abuse laws." As awareness of this dark issue spreads, the chances of laws thwarting nonconsensual deepfake pornography increase. Remember, a deepfake can be made of anyone, no matter how famous or average, whether from pictures of Taylor Swift performing in the largest stadiums in the world or of a regular college student during homecoming week. What is stopping one from being made of you?