Christina Trevanion, a prominent auctioneer and expert on the BBC's Flog It!, recently opened up about her distressing encounter with deepfake technology during an appearance on Morning Live. The incident, which occurred last September, involved the unauthorised use of her likeness in explicit deepfake videos, a growing digital privacy and security concern.

During her emotional account, Trevanion described her longstanding experience in the public eye, stating, "I'm used to living life in the public eye. Often the reaction from the public has been kind and sweet and supportive, but over the last couple of years there's been a noticeable shift and at times it can be quite intrusive." Her revelation highlights the dark side of fame: she explained how she received a lengthy list of hyperlinks directing her to pornographic videos onto which her head had been digitally superimposed. The experience left her feeling "naive, and stupid and utterly violated in every single way."

The rise of deepfakes, which use artificial intelligence to create altered multimedia content, often with malicious intent, has outpaced the legal landscape. Trevanion's ordeal reflects a broader problem affecting many women who have become victims of similar abuse. Another victim, identified only as Jodie, recounted her own traumatic experience with deepfakes and the emotional turmoil it caused. "I just felt like my whole world shattered around me," she said, expressing fears for her relationships and her future as she grappled with how these false images might be perceived by others.

As deepfake technology has proliferated, legal responses have lagged. In the UK, while it is unlawful to share or threaten to share intimate images without consent, there is currently no specific legislation addressing the creation of deepfakes. This gap in the law has prompted Baroness Charlotte Owen to advocate for change. Speaking about the government's hesitance to legislate against deepfakes, she cited a 2022 report from the Law Commission, which did not consider the creation of such images, in itself, a serious enough offence, on the reasoning that "if someone doesn't know about it, there's no harm caused."

Baroness Owen is advancing legal amendments in Parliament that aim to address the non-consensual creation of explicit images, making it clear that consent is paramount. “The bottom line should be if a woman does not consent, that should be enough,” she asserted. The proposed legislation seeks to classify deepfakes as acts of abuse, thereby establishing potential penalties, including fines and imprisonment for offenders.

Although Trevanion has succeeded in having most of the deepfake content featuring her likeness removed, she acknowledges that the emotional scars remain. "It's something I will always have hanging over me and other victims," she remarked. Her experience underscores the ongoing struggle against the misuse of such technology and the need for robust legal frameworks to protect victims of deepfake abuse.

The conversation around deepfakes continues to gain traction as individuals and organisations push for stronger legal protections in an era when technology can easily blur the lines between consent and reality.

Source: Noah Wire Services