'Another Body' exposes the nightmare of deepfake porn, lack of resources to stop it

"This abuse aims to silence and to shame, and it's very, very effective," co-director Sophie Compton said


While much has been debated about the benefits and destructive applications of AI, filmmakers Sophie Compton and Reuben Hamlyn's documentary Another Body (now in select theatres) shines a light on the nightmare reality of deepfake porn targeting women.

The film first introduces us to Taylor, an engineering student who gets a message from a friend with a link revealing that her face has been edited onto the bodies of porn actors, with the videos posted all over Pornhub.

She's terrified and humiliated, and from there the film follows Taylor as she tries to figure out who did this to her and how to stop the videos from being created and spread.

When Hamlyn and Compton started making this documentary, Hamlyn was studying far-right communities on 4chan at King's College London. Shortly after that, deepfake technology began appearing on Reddit and then spreading to other online communities, like 4chan.

"There started to be some media coverage of it, but it was only talking about politics, the threat to democracy and geopolitical stability, and at that point that was totally speculative," Hamlyn told Yahoo Canada. "It completely overlooked what the real issue was, which was that this technology was being used to target women."

Hamlyn then got in contact with Compton, who had worked on a play about intimate image abuse, to be part of the project. The pair went through an extensive research phase, scouring 4chan and conducting reverse image searches, which led them to discover Taylor's deepfakes on Pornhub and xHamster. As the filmmakers explained, Taylor's personal information was accessible, which led them to her Instagram account.

"We actually waited for like 10 days, consulted with some experts and victims, survivors about the best way to approach, whether or not we should approach, and after sort of feeling confident in how we were going to go about it, we got in contact with her," Hamlyn explained.

"[We] said, 'Hey, we found these videos, we think you should be aware of them, here are some resources to support, both psychological and legally. By the way we're filmmakers and if you feel like sharing your story, get in touch.' And she did, about a day or two later."

Another Body documentary from filmmakers Sophie Compton and Reuben Hamlyn (levelFILM)

'Tech companies that will do absolutely nothing if it's not illegal content'

The big reveal in the film is that the Taylor we see isn't real. After she states that she would only feel comfortable sharing her story with a level of anonymity, we learn that the Taylor in the documentary is an actor's face deepfaked over hers. It's not only a choice made for Taylor's safety; it also hits you as an audience member with just how real these deepfakes look, making Taylor's story even more devastating.

But in order for Taylor to candidly share her journey toward seeking some kind of justice and express how these videos have impacted her life, there had to be a level of trust established with the filmmakers, who also gave Taylor the space to film a lot of the footage herself. As Compton explained, it was critically important to be "transparent."

"Something that's so important when you're working with survivors is to not lose sight of the reality of their life, and I think that sounds so obvious, but it's so easy in the production process, with a goal of finishing a film in mind, to get slightly tunnel visioned," Compton said. "I think that dynamic is so important and it needs to be built on transparency, and it needs to be built on trust and it also needs to be built on both of you having a shared goal."

"We had to navigate so many things with Taylor around when to give her space, and when to step back and allow her to process what needed to be processed. And when to push and say, 'It might feel uncomfortable now, but we really encourage you, if you can, to record what you're going through right now, because now's the moment when those emotions are fresh. So it's going to be the hardest time to tell that part of the story, but it's also going to be the most impactful.'"

Compton and Hamlyn's work to not just make Taylor comfortable, but to be active participants in the call for more resources for women who have fallen victim to deepfake porn, is evident through the #MyImageMyChoice campaign they started after realizing there is little awareness of digital violence against women.

"We realized that we were collecting this really important archive that was so valuable to research bodies, governments and all the people that try to design the policy work around this, who were failing to find survivor stories, which I always find a bit ridiculous because surely that should kind of be their job," Compton said. "We think that deepfakes need to be criminalized, because you need to send a clear message, this content is not acceptable, and the content needs to be illegal to enforce any action on the tech companies that will do absolutely nothing if it's not illegal content."

"I don't really know how other people manage to work without a really cool activist piece, but I think that the fact that we were building this campaign alongside making the film meant it felt like all of us ... were pieces in this like larger fight that we were all trying to move towards. ... I think it's also important for documentarians to have confidence that the film itself is the most powerful piece of activism that you could make, because it's an hour which somebody is going to immerse themselves in this story. If you tell it well, you're creating empathy, you're creating connection, you're giving so much context."

'Another Body' exposes the nightmare of deepfake porn, lack of resources to stop it (levelFILM)

'They know that if they speak out that they risk greater retaliation'

In terms of the systemic issues facing women who try to seek some sort of resolution after deepfake porn featuring their face appears online, Hamlyn identified that online gendered violence combines two things that police "struggle" with when it comes to new technologies. First, digital crimes that police forces are "ill equipped" to deal with due to a lack of training. Second, police "don't see the clear harm" caused by intimate image abuse, because "they don't see the physical crime."

Some of this is exhibited in Another Body when Taylor calls the police, who seemingly struggle to fully understand what happened and are unsure of how to respond.

"I think there's the entrenched misogyny that's common in police forces that also creates indifference or apathy towards the targets of this," Hamlyn said. "So when you go to a police force and say, 'AI has put my face into pornography,' they're first of all, like, 'What the hell are you talking about?' And secondly, 'What's the problem here?' Particularly when there are no laws in place, which are forcing them to act."

"It's a very unfortunate situation to find yourself in and that's why we think it's so important that there is increased education for police forces, both about the impacts of intimate image abuse on the whole, not just deepfake pornography, as well as finding out new strategies for investigating online crimes."

"This abuse aims to silence and to shame, and it's very, very effective because people don't want to draw attention to the deepfakes of them," Compton added. "They know that if they speak out that they risk greater retaliation."

Back in April, a Quebec man was sentenced to eight years in prison for producing deepfake child pornography, the first case of its kind in Canada.