Australia’s eSafety Commissioner has already received a number of complaints about non-consensual distribution of deepfake intimate images, and expects this type of abuse to grow in volume as artificial intelligence (AI) technology becomes more accessible.
“Looking ahead, I’m concerned AI-related harms may morph and combine with those we’re also starting to see in the metaverse, especially harms affecting children,” Commissioner Julie Inman Grant says.
“There is potential, for example, for generative AI to automate child grooming at scale and in a highly personalised way.”
A deepfake is a false but seemingly realistic photo, video or sound file made using AI.
While some uses are relatively harmless, such as the deepfake images of Pope Francis wearing a puffer jacket, the technology can also be used for sinister purposes such as manipulation and abuse.
The Australian Government is currently consulting on how best to address the harms and risks of generative AI, including deepfake misinformation and abuse, risks to privacy, bias and a lack of transparency.
Inman Grant warns about the potential for AI-generated sexual abuse material, including material based on real images of children sourced online.
“We are already aware of paedophiles scraping children’s images from social media and using generative AI to create child sexual abuse imagery according to their predatory predilections,” Inman Grant says.
A variety of generative AI tools – image, video, text and more – have been released without safety guardrails in place to prevent this kind of abuse.
Inman Grant is calling on the technology industry to prioritise safety from the outset.
Having consulted broadly with Australian and global AI experts, the Commissioner’s office says its next tech trends brief will address the safety implications of generative AI and the mitigations needed, including safety advice for industry and the public.
Inman Grant encourages Australians suffering any kind of image-based abuse, including deepfakes, to report it to eSafety.gov.au.