Australia Passes Deepfake Legislation, Criminalizing Creation and Sharing of Explicit Content

Australia has taken a significant step in combating the proliferation of sexually explicit deepfake content with the passage of the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. The legislation, introduced by Attorney-General Mark Dreyfus in June and passed on August 21, imposes severe penalties on individuals who create and disseminate deepfake pornography using artificial intelligence technology.

The new law addresses the alarming trend of using AI software and apps to generate pornographic material by superimposing victims' faces or likenesses onto explicit content without their consent. Dreyfus emphasized that women and girls are most often targeted and degraded by such malicious acts. The bill strengthens existing Commonwealth Criminal Code offenses and introduces an aggravated criminal offense for sharing non-consensual deepfake sexually explicit material.

Under the legislation, individuals found guilty of sharing non-consensual deepfake content face imprisonment for up to six years. Moreover, if the person who created the deepfake also shares it without consent, they face a higher penalty of up to seven years' imprisonment. The law also covers real images shared without consent, ensuring a comprehensive approach to combating this issue.

Labor Senator Murray Watt highlighted that the new criminal offenses are based on a consent model, covering both artificial and real sexual material. However, Shadow Attorney-General Michaelia Cash expressed concerns about certain aspects of the bill, including the potential cross-examination of victims in court.

The deepfake legislation builds upon the Australian government’s ongoing efforts to address cyberbullying and harm. These initiatives include increased funding for the eSafety commissioner, an early review of the Online Safety Act, and a commitment to tackle practices like doxxing.

Advances in technological capability have driven a surge in deepfake images online, a significant portion of which are sexually explicit. A parliamentary inquiry revealed that 90 to 95 percent of deepfakes involve non-consensual pornography, and that 99 percent of victims are women. Apps enabling the creation of such content have become increasingly common, as noted by eSafety Commissioner Julie Inman Grant.

South Australian Senator Kerrynne Liddle emphasized the harm caused by deepfakes, citing cases where criminals have used such content to extort money. Liddle, who serves as the shadow spokesperson for child protection, stressed the need for greater education so that young people understand the harm deepfakes cause and do not become victims or perpetrators themselves.