Her inbox was flooded with explicit screenshots, seemingly from a pornographic video. She recognized her face, but not her body. For just a second, she questioned whether the footage could be real. It certainly looked that way, though she knew she had never filmed herself nude.
It was a deepfake – so realistic that even the woman featured in it was momentarily fooled.
QTCinderella is a fan-favorite Twitch streamer known for wholesome gaming and baking content. But late last month, on Jan. 30, a fellow streamer briefly showed a browser window open to a website that creates AI-generated explicit content of women, including female streamers. Deepfake porn of the 28-year-old could be found on the site, and since then, she says, her name, her face and her brand have become associated with pornography.
Overwhelmed with shock, confusion, panic and pain, she decided to share her feelings in an impromptu livestream to her 800,000 followers.
This, she says, is what sexual trauma looks like.
"I wanted to show this is a big deal," says QT, who asked we refer to her using her username for privacy reasons. "That every single woman on that website, this is how they feel. Stare at me sobbing, and tell me you still think this is OK."
Deepfake porn lives on in screenshots
Deepfakes, videos that use artificial intelligence to superimpose a person's likeness onto source material, are not new. The process can be used to make it look as if people said, or did, things they did not. Experts worry that the technology can do more harm than good – especially for women in the public eye.
What is a deepfake? This video technology is spooking some politicians
Platforms like Reddit have banned deepfake porn, but smaller sites, like the one that shared fake images of QT, still exist. And even if a person succeeds in getting an explicit video removed, as QT was able to do, screenshots continue to circulate.
Along with the harassment, stalking and misogyny that female streamers often grapple with online, this modern form of sexualization, QT says, made her feel exploited and "purely like an object."
The porn may be fake. But the trauma isn't.
Though fans and other women in the public eye are voicing their support, QT says she has also received an influx of hate and victim-blaming messages, most of them from men who don't understand how fake images can cause real harm. Others believe this is the price women pay for internet fame.
"This is nothing I've done. I haven't done anything wrong. That's what's crazy about all this: We (as women) have done nothing wrong. We just existed," QT says.
She has also struggled with the pain of having these pictures sent to her family – the discomfort of having to explain the photos, over and over. It's a humiliating conversation she never thought she'd have to have.
Contrary to popular belief, licensed clinical social worker Jessica Klein says, an image, altered or not, is enough to create real, tangible trauma, and for some, diagnosable PTSD. Research suggests the mental health effects of sexual assault and image-based, nonconsensual abuse (like revenge porn) are similar.
"Something doesn't need to physically happen to your body to be traumatizing," says Klein, who works with victims of revenge porn. "It's a violation. A sense of helplessness, fear and shame. Your sense of safety is completely annihilated when your body is being portrayed in this nonconsensual way for millions to view."
For QT, the objectification of her body – against her will – was all too familiar. It triggered memories of her sexual assault experience.
"Minutes after I saw that photo, I felt the same way," she says. "The same feeling of guilt, with the same feeling of being used. And it's because it's another thing I didn't agree to. Another thing I didn't want to do. Another version of me I didn't want seen or touched or looked at."
Maya Higa, a fellow streamer who creates conservationist content and was also a victim of deepfake pornography, shared the same sentiment.
"Today, I have been used by hundreds of men for sexual gratification without my consent. The world calls my 2018 experience rape. The world is debating over the validity of my experience today," Higa wrote in a Twitter statement on Jan. 31.
'I'm a normal girl'
QTCinderella had a lot to lose when deciding whether she wanted to publicly speak about the incident. She wants to move past the controversy.
But the reason she is doing this is "for the women that really can't afford to have this on there."
"We need federal laws," QT says. Experts like Klein agree: While many U.S. states have laws against "revenge porn" and nonconsensual nude images, only three states (California, Virginia and Texas) specifically include deepfakes.
"We need something to happen to people that take advantage of others," QT says. "That's fundamentally a change that hopefully we can all agree with, and if you can't see it that way, I beg every person to imagine seeing these types of photos or videos of someone they care about, against their will."
Beyond this controversy and their bodies, women like QT deserve to be known for their humanity. Take her contributions to the Twitch space: She spearheaded the annual Streamer Awards, an opportunity to bring gamers together and celebrate one another as a community. In her free time, she raises money for Alveus Sanctuary, an animal sanctuary.
"I'm a normal girl," she says. "I like Taylor Swift. I like baking cookies. I like going to Disneyland."
If you are a survivor of sexual assault, RAINN offers support through the National Sexual Assault Hotline (800.656.HOPE & online.rainn.org).
This article originally appeared on USA TODAY: QTCinderella, deepfake porn and the trauma it causes