A shocking investigation reveals that nearly 4,000 celebrities, including 255 from the UK, have fallen victim to deepfake pornography, sparking outrage and calls for tighter regulation of AI.
An investigation by Channel 4 News has uncovered that nearly 4,000 celebrities, including 255 from the UK, have been victims of deepfake pornography. Deepfake technology involves artificially superimposing individuals’ faces onto explicit content. Among the British victims is Channel 4 News presenter Cathy Newman, who described her experience as “sinister” and unsettling.
This revelation comes amid growing concerns over the misuse of artificial intelligence (AI) to create and spread deceptive imagery. The Online Safety Act, which came into force in the UK on January 31, makes sharing non-consensual deepfake imagery illegal, though creating such content is not. The legislation is part of broader efforts to protect individuals from the harm these digital violations cause.
Victims, including Sophie Parrish from Merseyside, described the distress and degradation of discovering these digitally fabricated images. Parrish’s testimony highlights the dark side of deepfake technology and its potential to harm individuals’ wellbeing and dignity.
The investigation also pointed to the role of major tech platforms and search engines in facilitating access to deepfake content, with companies such as Google and Meta promising stronger protections against its spread. Caroline Nokes, chairwoman of the Women and Equalities Committee, condemned the horrific practice of targeting individuals, particularly women, with such content.
Enforcement of the Online Safety Act, with Ofcom at the helm, aims to curb the circulation of illegal content and protect victims. The case of deepfake pornography underscores the complex challenges posed by AI’s growing ability to create realistic fake content, and the ongoing need for robust legal and social safeguards.