The British Association for Counselling and Psychotherapy (BACP) is warning about the growing dangers of children using AI tools such as ChatGPT for mental health advice.
Its new survey revealed that more than a third (38%) of therapists working with under-18s have clients seeking mental health guidance from AI platforms. And almost one in five (19%) therapists reported children receiving harmful mental health advice.
Therapists have told the BACP that some AI tools are providing potentially harmful and misleading information, including encouraging children to self-diagnose conditions such as ADHD and OCD, reinforcing avoidance behaviours and automatically validating their feelings regardless of what they express. There have also been tragic cases where AI tools have given dangerously misguided advice and even encouraged suicide.* Therapists are also particularly concerned about AI's inability to offer real-time support or intervene in crisis situations.
Ben Kay, Director at BACP, which is the largest professional body for counselling and psychotherapy in the UK and has more than 70,000 members, said:
"It's alarming that children are increasingly having to turn to AI chatbots like ChatGPT for mental health support, often unable to tell whether the advice they're getting is safe or even true. Some have already suffered devastating consequences. And this is likely just the tip of the iceberg, with many more children struggling in silence, without access to real therapy.
"We want parents, carers, and young people to understand that using AI for mental health support isn't the easy, safe, or quick fix it might seem to be. There are real risks involved, and it must be approached with caution. While AI is accessible and convenient, it can't replicate the empathy, connection, or the safety of therapy delivered by a real person trained to understand complex mental health challenges and assess risks. Children in distress could be left without proper professional support. The information shared with AI also doesn't have the same protections as therapy.
"Too many young people are turning to AI because they can't get the mental health support they need. That's unacceptable. The government must step up and invest now in real, professional therapy through the NHS, schools, and community hubs. No young person should ever be forced to turn to a chatbot for help. AI might fill gaps, but it can never replace the human connection that changes lives. Young people deserve more than algorithms; they deserve professionally trained therapists who listen."
New survey findings
The BACP's annual Mindometer survey, which gathered insights from nearly 3,000 practising therapists across the UK, shows that more than a quarter (28%) of therapists, working with both adults and children, have had clients report unhelpful therapy guidance from AI. And almost two-thirds (64%) of therapists said that public mental health has deteriorated since last year, with 43% believing AI is contributing to that decline.
Senior accredited therapist Debbie Keenan, who works at a secondary school and has her own private practice in Chepstow, added:
"I'm definitely seeing more children and young people turning to AI to seek therapy advice and to self-diagnose conditions such as ADHD and OCD. This raises real concerns for me. As advanced as AI is, it simply can't do this. It also can't tell if a child is distressed, dysregulated or in danger. If a child was telling me that they were going to hurt themselves, or they had suicidal ideation, support would be in place for that child before they left my room, but would AI do that?
"Furthermore, I'm also concerned about the current risk of children isolating and disconnecting from real human relationships; this can lead to an over-reliance on AI for emotional support and increase feelings of loneliness, making it harder to reach out for 'real life' support.
"I believe children are increasingly turning to AI for therapy because it's available 24/7. It feels non-judgemental and offers a sense of privacy. However, AI retains data, it isn't bound by ethical or confidentiality standards, and it lacks regulation or accountability. While it may fill the gap in access to mental health support, it cannot replace human connection or recognise subtle emotional cues like a trained psychotherapist can."
Amanda MacDonald, a BACP registered therapist who provides support for children, teens and adults, said:
"AI therapy bots tend to adopt one of two approaches: offering validation or providing solutions. Both lack the nuance of real therapy and risk giving advice that contradicts best practice for emotional distress. For example, some AI tools have advised individuals with OCD to continue their compulsions, mistaking short-term relief for progress. Others have encouraged avoidance of anxiety triggers, which may feel helpful initially but can worsen anxiety over time by reinforcing avoidance behaviours.
"There have also been well-documented, tragic cases where AI tools have given dangerously misguided advice and even encouraged suicide, outcomes that are both devastating and deeply alarming.
"Parents and carers should be aware that their children may be turning to AI for guidance and advice. While it's important to keep appropriate parental controls in place, open and honest communication at home is just as vital. Talk to your children with curiosity and share your concerns in an age-appropriate way.
"Children and adolescents aren't yet equipped to fully assess risk, so parents play a crucial role in keeping them safe. Balancing privacy with safety is never easy, but without that balance, young people can become overly reliant on what is ultimately a very clever algorithm; one that lacks the ethical and safeguarding standards found in helplines, therapy, or school-based support.
"Reaching for their phones when they're upset feels natural for many young people, especially as AI tools can seem supportive and validating. This creates a valuable opportunity for families to talk about their relationship with phones and technology. Parents can help by modelling healthy behaviour, such as setting shared screen-free times and recognising when they themselves instinctively turn to their phones. After all, phones were designed to connect us, but if we're not careful, they can start to replace real human connection."
References:
All figures are from BACP's annual Mindometer survey of its members. The total sample size was 2,980 therapists, and fieldwork was undertaken between 3 and 17 September 2025. The survey was conducted online.
* https://www.bbc.co.uk/news/articles/cp3x71pv1qno
