A world where any woman could end up in a porn film without her consent
Deepfake technology means we're already there, says feminist campaigner Jess Davies, the subject of this month's The Backlash Q&A.
Last week, British reality TV star Stephen Bear was sentenced to 21 months in prison for sharing a video of himself and a former girlfriend having sex. Bear filmed Georgia Harrison, also a reality star, without her knowledge on CCTV cameras in his garden, put the video on OnlyFans, and shared it on WhatsApp. On top of the prison term, he was given a five-year restraining order and ordered to sign the sex offenders’ register.
The sentence sent a message, at least, that sharing private, sexual images without consent amounts to abuse. “I want to let all other victims of this crime know that I stand in solidarity with them,” said Harrison, who waived her anonymity and was commended by the Crown Prosecution Service for showing “bravery and determination … throughout the case”.
It is worrying enough that intimate footage could be shared without your consent. But what about the dystopian possibility of your likeness being used in pornography that is a complete fabrication — and without your knowledge? Advances in deepfake technology mean that this is already our reality, says Jess Davies, the subject of this month’s The Backlash Q&A. The presenter, feminist campaigner, and ex-model investigated deepfakes, which overwhelmingly target women and girls, in a BBC documentary last year. Her film revealed just how easy it is to make deepfake porn and how powerless current legislation is to stop it. I asked her more about this disturbing trend.
The Backlash: You have made two films for the BBC about online sexual abuse. One was about the online trade in stolen nude photos, which was based on your own experience, and the other, more recent one was about deepfake porn made surreptitiously without the consent of the women whose images are used. Some of these women are celebrities, some are politicians, and some are ordinary people with no public profile. You paint a pretty scary picture, opening the film with this chilling line: “Imagine a world where any woman could end up in a porn film without their consent. We’re in this world now.” How widespread is this? And how worried should we be?
Jess Davies: Deepfake porn is a growing problem and disproportionately targets women. Research shows that around 96 per cent of deepfake content online is pornographic, in stark contrast to the way deepfake technology is usually presented in the media, for example videos of actors and politicians circulating on social media. In the documentary, we found that the most popular deepfake porn website was receiving millions of visits per month, so the demand for this non-consensual content is huge.
TB: What did you uncover in your investigation that most shocked you?
JD: The thing that surprised me most was seeing requests being posted by men for deepfake porn content of their friends and family members. There was such a huge lack of empathy and total disregard for the women they were desperately seeking pornographic content of, without paying any mind to the lack of consent around this technology, or the trauma it may cause to the women being placed in these videos. To see so many men place their sexual desire over these women — some of them their family members — without their consent was a terrifying thing to watch unfold.
TB: It seems astonishing that there is still no legislation to deal with this problem. In the UK, the Online Safety Bill will criminalise the sharing of deepfakes, but it is still going through Parliament and has yet to pass. Why is action on this so slow?
JD: I think unfortunately our legal system is not designed to keep up with the fast-paced environment of the digital world. The Online Safety Bill was first discussed in Parliament in 2019 and it is already out of date in certain areas because technology and online harms are developing at such a speed. We’re also facing a group of people, many of them older and many of them men, who simply do not understand how the digital world works, especially for women. This makes it a lot more difficult to try and emphasise to them just how big a problem violence against women and girls is online. We’ve also seen the focus shift because of culture wars around “wokeism” and free speech, so that the Online Safety Bill is in danger of being watered down and failing to make much difference at all when it comes to protecting women and girls online.
TB: In the documentary, you uncovered a lot of posts sharing or discussing deepfakes that contained expressions of extreme misogyny and sexual violence. As you say in your programme, many of the people consuming or creating these deepfakes seem driven by a desire to humiliate women and girls, to make them vulnerable. Is this a function of misogyny? And do you see this trend as related to the backlash against feminism?
JD: Absolutely this is connected to misogyny. The individuals creating and consuming this content have a total disregard for the women in the images; they swap their heads like Barbie dolls for their own sexual pleasure, because they simply do not see women as worthy of respect. The motivation is split between sexual desire and an agenda to blackmail or shame the women they are requesting deepfakes of. I also saw men sharing girls’ Instagram handles and encouraging others to send deepfake porn of these women to them, then screenshot the replies and share them with the men in the forum. It was a sick game for them to see how these women would respond.
TB: As you also document, female politicians are often the target of deepfakes, which takes the online abuse and gendered trolling female leaders already face to another extreme. Is there a risk that women will be deterred from entering politics, or public leadership positions, because of the increasing sophistication of the attacks they can come up against?
JD: Unfortunately female politicians face much more online abuse than their male counterparts, and it often takes the form of image-based sexual abuse. It is used by these men as a way to silence powerful and confident women and assert control over them. They’re hoping to bully them into silence and into staying in their lane; it’s all about control and shame. This can absolutely have a negative effect when it comes to trying to encourage women into politics, because why would you want to put yourself or your family through that? But then the perpetrators’ tactics would win and we would see fewer women in politics and in successful positions. It’s almost like we have to accept that they are going to use image-based sexual abuse to target us until something serious is done about protecting women online.
TB: How can women and girls protect themselves from being targeted?
JD: Unfortunately there is no real way to protect ourselves from this technology, which is a terrifying thought. The advice some give is simply not to share any images or videos of yourself online, but in a world where we use the internet for everything from ordering groceries to taking online classes, this advice is unrealistic. It also amounts to victim blaming: why should we have to delete our social media altogether because there are others out there abusing us? It shouldn’t be on us to hide ourselves away. But if you do find yourself in a deepfake porn video or image made without your consent, you can report it to the social media site and also contact help services such as the Revenge Porn Helpline in the UK and CEARTAS, a DMCA takedown service, which will seek to remove revenge porn content for free. I would also advise reporting it to the police, as this could be classed as blackmail or harassment, and it will help make the police aware of how common these types of attack are.
Jess’s documentaries When Nudes Are Stolen and Deepfake Porn: Could You Be Next? are on BBC iPlayer. You can follow her on Twitter, Instagram and TikTok.