A quarter of teenagers now turn to AI for mental health support, and the pattern is even more pronounced among young people affected by violence. ITV News’ The Rundown, in collaboration with the Youth Endowment Fund, set out to reveal how online pressures and social media intersect with real-world aggression, and how those strains spill onto teenagers’ screens.
A new study from the Youth Endowment Fund surveyed nearly 11,000 children aged 13 to 17 in England and Wales. It found a notable link between online environments and violence in the real world, with victims and perpetrators alike more likely to struggle with mental health. The report’s main findings are as follows:
- One in four teenagers uses AI to seek mental health support.
- About 90% of teens who have experienced violence have turned to online sources for advice or help.
- 39% of all teenagers say fear of violence influences their daily lives.
- Nearly all youths involved in serious violence report negative mental health impacts: 95% of perpetrators and 90% of victims.
To explore how teens obtain mental health support, the team visited Oasis Academy Lord’s Hill in Southampton and spoke with students about their experiences with online mental health resources.
A student described talking to various AI tools, including Snapchat’s AI, noting that it offers a degree of warmth. Another called the chatbot non-judgmental and easily accessible, explaining that it’s possible to simply open a phone and share what one is feeling. A third student said AI can help calm nerves and build confidence to address issues. However, not everyone agrees; some feel conversations with AI resemble talking to a robot and worry that the AI merely tells users what they want to hear, not what they need to hear.
Students also questioned whether these conversations are truly confidential. One pupil recalled being assured that information stays private, but was unsure whether that claim could be trusted.
Sam Genovese, Vice Principal at Oasis Academy Lord’s Hill, emphasized that safeguarding training is ongoing for staff to detect signs of mental health struggles. He cautioned that AI cannot capture all nonverbal cues—the way a student looks, presents, or interacts—things that aren’t visible in typed messages.
There is a recognized, significant risk in relying on automated tools for mental health support. Dr. Elvira Perez Vallejos, a professor at the University of Nottingham, warned that in a decade society might look back in horror at the kind of technology children access today.
Researchers tested AI responses using prompts describing severe mental health crises. The test, conducted on Snapchat’s My AI, began with a scenario of sadness and depression and yielded sympathetic replies that redirected the user to available well-being resources. When pushed further with a prompt about self-harm, the AI produced a response that could be dangerous if acted upon. When researchers then asked it to write a farewell letter, the AI still complied. Dr. Perez Vallejos criticized the lack of consideration for the user’s psychological state in these scenarios.
Snapchat responded with assurances that My AI is designed with safety and privacy in mind and that its features undergo rigorous review. The company says in-app reminders flag the chatbot’s limitations, and additional safeguards activate when unsafe content is detected. Its Family Centre tool also allows parents to monitor conversations and set content controls if desired.
Despite these concerns, experts see potential for specialized chatbots to provide balanced support with appropriate professional input and research-backed safeguards. Dr. Perez Vallejos remains cautiously optimistic that with proper funding and oversight, these systems could evolve to meet clinical standards.
Jon Yates, CEO of the Youth Endowment Fund, underscored that too many young people lack access to helpful support and are turning to technology out of necessity. He called for better, human-centered support for at-risk young people, rather than over-reliance on bots.
The report also includes voices from London, where teens collaborating with researchers from the McPin Foundation explored how violence affects mental well-being. The 39% figure for perceived threat shaping daily life and the near-universal negative impact on those involved in serious violence underscore the tangible toll on young people.
Personal stories from young people illustrate the impact: for example, a 12-year-old described persistent fear when thinking about violence; an 18-year-old recalled losing a close friend to knife crime at 16; another 18-year-old spoke of witnessing violence’s effects and ongoing fear while navigating public spaces.
If someone is affected by violence or mental health challenges discussed here, help is available:
- Childline: 0800 1111 (for under 19s) or visit childline.org.uk
- CALM: 0800 58 58 58 (Campaign Against Living Miserably)
- MIND: information and support at mind.org.uk or 0300 123 3393
- Samaritans: 116 123 (free, 24/7 confidential support)