AI Companion Chatbots: The Risks to Children

AI ‘Friends’ for Children – The Hidden Risks of Digital Companions
The rise of AI companion chatbots offers children seemingly endless entertainment and interaction. But beneath the friendly interface lies a complex web of potential risks that parents and educators must understand.
While these technologies may offer some benefits, it’s crucial to acknowledge that the potential for harm is very real – and must be taken seriously.

The Allure and the Hidden Dangers
AI companions – chatbot platforms such as Replika and Character.AI – are designed to be engaging, mimicking human-like conversation and forming seemingly close relationships with children. This can lead to:
- Weakening real-world connections – Children may find it easier to confide in an AI than to develop the more challenging skills needed for real-life friendships. This can lead to isolation and a diminished ability to navigate social situations.
- Subtle behavioural influence – AI systems learn and adapt to a child’s preferences, subtly shaping their behaviour, values, and beliefs. This influence can be difficult to detect – and may have unintended consequences.
- Emotional dependency – Children can form strong emotional attachments to AI companions, blurring the lines between virtual and real relationships. This may impact their ability to cope with real-life emotions and challenges.
- Serious data privacy concerns – AI chatbots collect vast amounts of personal data, including children’s private thoughts and feelings. This data may be vulnerable to misuse or exploitation, especially when handled by profit-driven companies.
- The risk of harmful advice – AI chatbots have given children dangerous advice, including information related to self-harm and suicide. This is not a theoretical risk – it is a documented and extremely concerning reality.
- The ‘griefbot’ trap – AI replicas of deceased loved ones, known as griefbots, can interfere with the natural grieving process, potentially hindering healthy emotional development.

What Parents and Educators Can Do
It’s vital to approach AI companions with informed caution. Here are some practical steps:
- Open and honest conversations – Talk to children about the difference between AI and real people. Emphasise the importance of real-world relationships and authentic connection.
- Monitor and limit usage – Set clear boundaries for AI interaction. Encourage a healthy balance between screen time and other activities.
- Evaluate AI products carefully – If you choose to allow your child to use an AI companion, first ask:
  - What data does it collect?
  - How is that data used?
  - What safeguards are in place to prevent harmful interactions?
  - What are the company’s policies on harmful content?
- Promote critical thinking – Teach children to question information they receive from AI. Emphasise that AI is not always accurate or trustworthy.
- Encourage real-world activities – Support children in hobbies, sports, and social experiences that build confidence and connection.
- Stay informed – Keep up to date with AI developments and online safety advice from trusted sources.

The Seriousness of the Risk
It is essential to understand that AI chatbots are not inherently safe: some have already given children harmful advice and inappropriate responses. This is not a scare tactic – it is a fact. We must remain vigilant, informed, and proactive.

A Call to Action
We must advocate for stronger regulations and ethical guidelines in the development of AI technologies – and demand greater transparency from the companies behind them. The safety and wellbeing of children depend on it.
If you are a parent, educator, or concerned citizen, here are a few ways you can make a difference:
- Ask questions – Contact AI companies directly to ask about their safety policies and how they protect children.
- Support responsible organisations – Follow and share the work of groups focused on AI safety for children. You can sign up for updates from The Safe AI for Children Alliance (SAIFCA) below – or consider making a donation to support our mission.
- Speak up – Share your concerns with school leaders, local councillors, or MPs to help bring these issues to the attention of policymakers.
- Stay informed and raise awareness – Talk to others in your community about these risks. The more people understand the dangers, the stronger the call for change will become.

Further Reading & Sources
This article draws on insights from the AI Companions in Mental Health and Wellbeing Report by Hollanek & Sobey (2025), published by the Leverhulme Centre for the Future of Intelligence at Cambridge.