Kate Chaney Urges Stronger Laws on AI Tools Driving Child Abuse Material (3 Sept 2025)
Sally Sara:
And a warning, the following story contains some distressing content. The federal government says it's looking to crack down on undetectable online stalking tools and nudification apps that use generative artificial intelligence. It comes as a roundtable on AI-generated sexual abuse material was held in Canberra yesterday, with victim-survivors and advocates highlighting the urgent need for action. Independent MP Kate Chaney was at the roundtable. I spoke with her earlier. Welcome back to Breakfast.
Kate Chaney:
Thank you very much, Sally.
Sally Sara:
What did you hear from victims and advocates about the impacts these kinds of tools can have, the online stalking tools, the nudification apps and others?
Kate Chaney:
It was pretty horrendous. The focus yesterday was on child sexual abuse. Law enforcement agencies are completely overwhelmed by the volume, and the severity is getting worse. These tools can be used to create horrific material that desensitises offenders, which can increase the severity of crimes committed against real children.
Sally Sara:
What are some of those who work in this area telling you about what it’s like being on the front line of all this? Because it sounds like, with some of these tools, the volume is increasing exponentially.
Kate Chaney:
We heard terrible stories from a Victorian policewoman in the victim identification team. She might open 30 cases in a day and has to choose which two or three she has time to investigate properly and which children to rescue. Identification has become much more difficult because investigators have to determine whether an image shows a real child or is AI-generated material, which still originates with real victims. They don’t have the AI tools that could help, so they have to watch the content.
She told me about listening to the screams of a toddler repeatedly, trying to hear the TV program in the background to identify which state it was in. Law enforcement desperately needs more resources and AI tools to counter these behaviours.
Sally Sara:
Last time we were talking, you made the point that even if this is AI-generated material, it’s been trained on real material. But also, as you’re saying, this AI material can push offenders further into this horrific world.
Kate Chaney:
That’s exactly right. The minister’s announcement yesterday about restricting access to nudify apps is fantastic. My bill deals with criminalising the possession or downloading of apps that generate child abuse material. These tools are specifically designed for that purpose. Perpetrators can train the tools with images of real children, then delete the images. They can generate abuse images with word commands and delete them each time. This is a gap in the criminal law that I want to address, to make it an offence to download these tools.
Sally Sara:
Do you think the government should go further than the announcements yesterday?
Kate Chaney:
Yes. The announcement yesterday is about access and sits within the regulatory framework. We need to harmonise criminal law with that framework and hold people accountable for downloading these tools with the intent of creating child sexual abuse material. Nudify apps are broader, but there’s still a gap in criminal legislation.
Sally Sara:
What do we know about how widespread these kinds of offences are?
Kate Chaney:
There were 78,000 images sent to victim identification teams through the International Centre for Missing and Exploited Children. That doesn’t include other referral channels. A small team is dealing with a massive increase in depravity, volume and speed. They really need AI tools to help with investigations, such as facial or voice recognition, and to review device content.
Sally Sara:
Some of these apps have also been used by children to bully other children. Should kids face criminal penalties if they’re creating this kind of material?
Kate Chaney:
There’s a difference between tools specifically designed to generate child sexual abuse material and broader nudify apps. Absolutely, we need to restrict access to nudify apps.
Sally Sara:
And nudifying means creating fake nude pictures of someone else, perhaps using their headshot and putting those things together. Is that right?
Kate Chaney:
That’s right. You can feed in a picture of someone and it generates fake nude images. Kids are using these apps, including on school buses, and schools don’t know how to respond. Some schools see incidents weekly. The International Centre for Missing and Exploited Children Australia is asking for more education and support for schools to deal with this.
Sally Sara:
How important is it to tread carefully with these responses, but also move quickly because the tech is moving fast?
Kate Chaney:
The tech is moving faster than anything parliament has ever had to respond to. That’s the big challenge of this term. Getting it right is important, but we can at least start with the low-hanging fruit. My bill criminalising these tools is that low-hanging fruit. I hope the Minister acts quickly on yesterday’s announcement.
Sally Sara:
Do you think it’ll get up?
Kate Chaney:
I hope so. It’s really important. Every week we delay, more children suffer harm.
Sally Sara:
Finally, on a separate issue, the government is facing ongoing pressure to implement the Murphy report into gambling harms. Have the crossbenchers had any indications from government about what’s happening?
Kate Chaney:
Not very promising. I believe there is some desire to act, but when I’ve asked the Prime Minister, he’s listed actions not covered by the Murphy report. I’m concerned the government is listening more to big money in sport, media, and gambling companies than to the community. There is a desperate need to ban gambling ads, as we did with tobacco.
Sally Sara:
Are you concerned nothing will be done in this term?
Kate Chaney:
I am deeply concerned it will be put in the too-hard basket. The report was released more than two years ago, and the government hasn’t even issued a written response. I think Peta Murphy would be disappointed by this lack of action. I’m not going to let it go. I’ll keep pushing, because it’s what the community wants.
Sally Sara:
Kate Chaney, thank you for your time this morning.
Kate Chaney:
Thanks, Sally.
Sally Sara:
That’s Independent MP Kate Chaney. And if you need help, you can contact Lifeline Australia on 13 11 14. If it’s an emergency, call triple zero. Or you can call 1800RESPECT on 1800 737 732.