Deepfake abuse is here

We need an urgent and radically honest response to the Bacchus Marsh Grammar deepfake incident

Yesterday, a boy from Victoria was arrested after AI was used to create and distribute child sex abuse material of at least 50 girls from private school Bacchus Marsh Grammar. Photos were taken from the girls’ social media accounts, deepfake tech was used to create pornographic material featuring their faces, and the images were uploaded to Instagram and Snapchat. The content was so graphic that one of the parents from the school told the ABC: “I almost threw up when I saw it.”

As I’m writing this, the police investigation is ongoing. An adult convicted of producing child abuse material would face maximum penalties of 10 years’ imprisonment under Victorian law and 15 years under Commonwealth law.

The story is chilling but unsurprising. So, what do we do?

First, we all need to be on the same page about how readily accessible AI deepfake tools are. Apps and websites that use AI to make explicit non-consensual images are not confined to shadowy corners of the dark web: they are so easily surfaced by search engines that Google has recently started downranking these sites in its results, they are advertised on Instagram, and they use Patreon to sell ‘custom requests’. In January 2023, popular Twitch streamer Atrioc (real name Brandon Ewing) accidentally revealed during a livestream that he was watching deepfake sexual abuse material of female Twitch streamers – women who were his friends. He said he found the content via an ad on PornHub (which gets more than 5 billion visits per month, according to Semrush).

Most of the reporting on this story has quoted the pioneering 2019 report by Deeptrace Labs, which found 96 per cent of all deepfake videos online are pornographic and non-consensual (from the perspective of the adult performer too, whose work is used as the basis of this content without permission or payment). While confronting, that figure alone doesn’t give us the full scale of the problem. The report also found: 96 per cent of these videos target women, and the top four deepfake websites received more than 100 million views across their videos.

This is a gendered form of abuse. It requires us to have a more nuanced conversation about pornography, abuse and gender than the one we’re having.

Right now, some voices insist that porn is being used as a scapegoat for gendered violence in the same way that video games are often wrongly blamed for causing violence. There is truth here – issues within the porn and sex work industry are the product, not the cause, of patriarchal systems. If every porn site were deleted from the Internet tomorrow, men’s violence against women would not be eliminated alongside them.

But unlike gaming, research does show links between pornography consumption and intimate partner violence. QUT researcher Maree Crabbe says porn consumption is also linked to “risky sexual behaviours, greater sexual objectification of women, rape myth acceptance, and sexual coercion and aggression.” We cannot deny that the era of plentiful and free Internet pornography (distinct from the DVD and magazine eras that came before it) is playing a role in maintaining toxic masculinity.

In the same vein, deepfake technology is not at the root of these problems, but it is already playing its role as a tool used almost exclusively to abuse and intimidate women and girls.

We need more technology-literate governments to address these issues. Legislation and regulation are crucial, and yet the solutions currently being offered up won’t help at all. Banning kids under 16 from social media will not work. Deepfake image-based abuse already breaches the content policies of mainstream digital platforms. Requiring age verification for online porn sites poses huge privacy risks… and also will not work. The negative impacts of porn use are mitigated by individual factors, so there is no way around it: parents and teachers need to facilitate healthy conversations about sex, gender and power to educate young people and to identify kids who might be at greater risk of committing these crimes.

Australia already has the laws in place – this is a crime. Where we’re failing, as with all sex-related offences, is within the justice system. As consent activist Chanel Contos pointed out in her National Press Club address last year, the current system presents only two options: “no consequences or jail time.” Neither is particularly helpful for victims of image-based abuse – deepfake or not – who have been harmed regardless. What would it look like to take a victim-centred justice approach? What do the Bacchus Marsh Grammar girls and their families want? A justice solution that puts their needs first would be the most revolutionary move we could make.

"It requires some imagination to visualise pterosaurs at sea, hunting fish and squid-like creatures alongside massive marine reptiles millions of years ago, in what is now the dry Australian outback. But the process is made easier with the fossils in front of you."

PhD candidate Adele Pentland led a research team from Curtin University that identified a new species of flying reptile with a 4.6m wingspan: Haliskia peterseni (Petersen’s sea phantom).

“A lot of companies don’t think with a wartime mentality in peacetime.”

Canva co-founder Cliff Obrecht on employees who waste his time.

53%

The Year 12 attainment rate in Tasmania, well below the national average of 76%.

Nationals MPs call for Paris Agreement to be abandoned.

While the Liberal Party clarifies its position on abandoning the 2030 emissions targets, some in the Nationals are pushing to leave the Paris Agreement entirely.
