27 Dec 2024

‘Deepfake porn’: how I became a victim

2024 was the year “deepfake porn” entered the lexicon. And the year I realised I – along with Taylor Swift, Jenna Ortega, Alexandria Ocasio-Cortez and Giorgia Meloni – had fallen victim to it.

Graphic pictures of Taylor Swift had already been viewed some 45 million times by the time my colleagues on the Channel 4 News investigations team decided to research the extent of this new and rather disturbing phenomenon. They wanted to establish how many women had been “deepfaked”, what impact this AI-generated abuse was having, and what lawmakers could do to put a stop to it.

At that point none of us knew I was among the millions of victims across the world.

We identified the five most popular deepfake porn sites hosting manipulated images and videos of celebrities. These sites attracted almost 100 million views over three months, and on them we discovered videos and images of around 4,000 people in the public eye. At least 250 of those were British.

We approached some 40 of the celebrities who had been targeted. No one wanted to participate in our film, for fear of driving traffic to the abusive videos online.

So when my colleagues stumbled across a deepfake porn video of me, I decided to become my own “case study”. I wanted to test out the effect of being deepfaked, and agreed to be filmed watching the video for the first time.

I cover wars, natural disasters and personal tragedies on an almost daily basis so I thought I’d be braced for anything as I prepared to confront my own deepfake.

The video was a parody of me. It was recognisably my face, expertly superimposed on someone else’s naked body. Strangely, my curly hair – one of my more defining characteristics – had been replaced. At that point, at least, AI struggled to replicate ringlets.

Most of the video was too explicit to show on television. I didn’t want to watch to the end but forced myself to see every frame. And I found it utterly dehumanising.

It felt like a violation to think that someone unknown to me had forced my AI alter ego into an array of sexual activities. The video has haunted me since, not least because whoever has abused me in this way is beyond the reach of the limited sanctions currently available.

After our investigation, the government announced that those making these images would be liable to prosecution, but the bill fell when the general election was called. The new government has, however, committed to reviving it. But the truth is that politicians and regulators around the world are being outpaced by the technology and by the perpetrators using it to abuse women.

Deepfake porn is proliferating on a scale few have begun to grasp.

Our team discovered that more deepfake porn videos were created in 2023 than in all other years combined. On the top sites alone, videos have been viewed more than 4.2 billion times. To put that in context, that’s almost three times the number of people who watched the last football World Cup final.

While high-profile victims have a platform from which to try to effect change, for most the consequences are terrible.

Sophie Parrish, a 31-year-old florist, discovered that someone close to her family had taken photos of her from Facebook and Instagram and uploaded them to a website where men share deepfake nudes, posting degrading comments and pictures of themselves masturbating over the photos.

Sophie told me she was physically sick when she saw the images. When I spoke to her, she was still dealing with the repercussions.

“I trusted everybody before this. My wall was always down… But now I don’t trust anybody… It’s horrible. I look at everyone now and I think: what are they doing and why are they looking at me?” she said.

The sharing of deepfake porn was already outlawed when the new offence was proposed, but the communications regulator Ofcom took quite some time to consult on the new rules. Ofcom’s new “illegal harms” code of practice, setting out the safety measures expected of tech platforms, won’t come into effect until April.

And in the meantime, this kind of content makes money for big tech. Almost a month after we alerted several of the key companies to my video, it could still be found in seconds with a quick search.

Perhaps 2025 will see change, with decisive action taken against those who weaponise AI to abuse and degrade women and girls.