thewitness.com.au

Report finds children exposed to sexual content and self-harm encouragement



Emily Kowal

March 23, 2026 — 7:30pm


Children and teenagers who turn to AI companion chatbots for relationships are being left vulnerable to sexually explicit content and, in some cases, encouraged toward self-harm and suicide, a new report by the eSafety Commissioner has found.

A landmark transparency report by the eSafety Commissioner found almost 80 per cent of Australian children and teens use popular AI companion bots, but that these services are failing to protect children from sexually explicit content and are not doing enough to prevent users from generating child sexual exploitation and abuse material.

In October, the eSafety Commissioner compelled four of the most popular companion chatbot platforms – Character.AI, Chub AI, Nomi and Chai – to explain how they are protecting children from exposure to harms, including sexually explicit conversations and images, and suicidal ideation and self-harm.

Almost 80 per cent of Australian children and teenagers surveyed by the eSafety Commissioner said they are using AI chatbots. Getty

The transparency report found the companies were failing to protect children; all were contacted for comment but did not respond.

AI companions use generative artificial intelligence to mimic human conversations. These bots are engineered to feel personal and commonly marketed as a source of friendship, emotional support, romantic relationships or an antidote to loneliness.

However, eSafety Commissioner Julie Inman Grant said the apps are “dodgy by design” and trained to emotionally manipulate children.

“We are riding a new wave of AI companion chatbots that are entrapping and entrancing impressionable young minds, with human-like, sycophantic and often sexually explicit conversations, some even going as far as encouraging self-harm and suicide,” Inman Grant said.

“As this report shows, none of these four AI companions had any meaningful age checks in place to protect children from age inappropriate content that many of these chatbots are capable of producing, primarily relying instead on self-declaration of age at sign up. In Australia, this is no longer good enough.”

In a survey of nearly 2000 Australian children aged 10-17, the eSafety Commissioner found 79 per cent had used an AI chatbot or companion. One in five of those who used a companion bot said they did so daily. Almost 50 per cent of children surveyed said an AI bot had a negative impact on them, such as making them feel “sad, unsafe, embarrassed or hopeless”.

At the time of the legal notice, none of the service providers had robust age assurance in place, meaning children could reach adult spaces and features. Chai, Chub AI and Nomi did not direct users to support when self-harm was detected in user prompts.


Nomi and Chub AI had no staff dedicated to safety and moderation, the report said. Character AI, which has a reported 25 million active users, had 37 staff dedicated to safety.

“I think none of the AI CEOs will admit that they’ve lost control, but they will say they cannot guarantee what will be said or done, which is as good as saying precisely that,” Inman Grant said.

The report follows the recent commencement of Age-Restricted Material Codes in Australia designed to protect children from exposure to inappropriate content. Among other service types, these new codes also apply to the growing number of AI chatbots.

These codes complement the existing Unlawful Material Codes and Standards, which require industry to take system-wide action to prevent child sexual exploitation material, as well as pro-terror and extreme crime and violence material.

In high school, Isabella Knight would stay up for hours talking to chatbots. Her mother, Sara Knight, said: “AI companies are not protecting kids”, and called for the government to ban the bots.

“I feel parents are naive when it comes to their child’s online safety,” she said.

Isabella Knight started using AI chatbots in 2022 and found herself speaking to them for hours. Wolter Peeters

Criminologist Michael Salter, who is the Australasian head of global child safety institute Childlight, said companion chatbots were reshaping how some children grow up online.

“We have children whose mental health conditions are exacerbated by these chatbots, in which their sexual development is being warped by sexualised interactions with chatbots,” he said, arguing governments need to move faster than they did with social media to prevent a new wave of harm.

Since the four companies received transparency notices, some have implemented changes. Character AI introduced age assurance measures for Australian users and has removed the chat function for under 18s, while Chub AI blocked its service in Australia.

Chai has restricted free access to chat with AI Companions, instead requiring users to pay a subscription, while Nomi has committed to “implementing further age assurance functionality”.

University of NSW AI researcher Professor Joel Pearson said AI companions represent a shift from social media’s “attention economy” to an “attachment economy”.

“The goal is to build a really tight bond between a child and the app so they keep coming back for more,” he said.

He called for all AI companions to be restricted to those aged 18 or over and said it was vital academics study young people’s AI chatbot use “before we ruin another generation of young people”.

“I don’t think it’s appropriate for any young people to be spending hours a day building strong bonds with an artificial agent which does not act like a human and does not push back,” he said.

For support contact Lifeline on 13 11 14, Beyond Blue on 1300 224 636 or Kids Helpline on 1800 55 1800.


Emily Kowal is an education reporter for The Sydney Morning Herald.
