Teen Marvel star Xochitl Gomez: Twitter won't remove explicit deepfakes of me

Just when you thought Twitter couldn’t get any lower, it finds new ways to sink further down. Doctor Strange in the Multiverse of Madness star Xochitl Gomez is 17 years old. Earlier this month, Gomez appeared on an episode of Taylor and Taylor Lautner’s podcast, The Squeeze. During the episode, Gomez brought up how she found nonconsensual, sexually explicit deepfake images of herself on the platform. That’s horrible on its own, but it gets worse. When Gomez’s team reached out to have the images taken down, they were unable to get Twitter/X to remove them. Again, Xochitl is only 17!

“It made me weirded out and I didn’t like it and I wanted it taken down. That was my main thought process was, ‘Down. Take this down. Please,’” Gomez said during the podcast. “It wasn’t because I felt like it was invading my privacy, more just like it wasn’t a good look for me. This has nothing to do with me. And yet it’s on here with my face.”

In a search Friday, NBC News was able to easily find several deepfakes of Gomez on X. A representative for X did not immediately respond to a request for comment.

NBC News reported in June 2023 that nonconsensual deepfakes of young female social media stars were circulating on X despite the platform’s rules against nonconsensual nudity. After reaching out to X, some but not all of the material was removed.

Gomez joins a chorus of girls and women, some famous and some not, who have spoken up about the growing crisis of nonconsensual sexually explicit deepfakes. Typically, these deepfakes use artificial intelligence to graft the victim’s face into a pornographic image or video. Search engines like Google and Microsoft’s Bing host such material in top image search results for prominent women’s names plus the word “deepfakes,” while links to websites that monetize the material appear in top web results. Both Google and Microsoft’s Bing have search results takedown request forms for nonconsensual deepfake victims and their representatives.

“It’s just weird to think if someone looked up my name, that’s also what pops up, too,” Gomez said during the podcast. “You can’t take them down.”

There is currently no federal legislation in the U.S. addressing nonconsensual sexually explicit deepfakes and only a patchwork of state laws pertaining to deepfakes, but a federal bill that would criminalize the nonconsensual sharing of the material is awaiting further action.

“Why is it so hard to take down? That was my whole thought on it, was, ‘Why is this allowed?’” Gomez said. “In my mind, I knew that it wasn’t me, so it didn’t mess with me or anything like that. It was just something that felt really uncomfortable because I couldn’t take it down.”

“Nothing good comes from thinking about it,” she added. “I put my phone down […] I do some skincare, I go hang out with my friends, something that will help me forget what I just saw.”

[From Yahoo]

Holy f–k, this is horrible. Xochitl is way calmer than I’d be about this. Women used to have to worry about revenge porn and now we have to worry about people just straight up creating sexually explicit deepfake images of us. We need to make a lot of noise to draw enough attention to this issue to force X’s billionaire owner to have them removed. That’s got to be the quickest option for now, right?

But seriously, why is it so hard to take down? We all know why it’s so hard to take it down from Twitter, but why is this not a bigger issue for legislatures? That federal legislation that would criminalize it was first introduced in the House of Representatives last May and has stalled. One of the teenage girls who spoke out after it was introduced said that male classmates had created explicit deepfakes of more than 30 girls. This is happening in high schools! Where are the politicians and people who are so concerned about “protecting children” from “sexually explicit content”? Even with such an unproductive House, you’d think this would be a bipartisan no-brainer.

Photos credit: Jeffrey Mayer / Avalon, Xavier Collin / Image Press Agency / Avalon

