By Sean Keating
OMAHA, Neb. - The statistics paint a troubling picture: nearly half of American youth suffer from a mental disorder, suicide ranks as the third leading cause of death for those aged 12 to 25, and adequate mental health care remains financially out of reach for many. According to the National Institute of Mental Health, 49.5% of young people experience mental health challenges, yet nearly 50% of individuals who could benefit from therapeutic services cannot access them.
Faced with these barriers, a growing number of young Americans have turned to an unconventional source of support: artificial intelligence chatbots like ChatGPT. The trend gained such momentum that sharing AI therapy experiences became a viral phenomenon on TikTok this past summer. But as adoption has surged, so too have concerns about the dangers these tools may pose to vulnerable users.
The Appeal of Instant Support
For many young people, AI chatbots offer something traditional therapy cannot: immediate, 24/7 access without the burden of insurance requirements or financial strain. With medical bills being the leading cause of personal debt and young adults having the highest uninsured rate of any age group, the free, instant nature of AI support holds undeniable appeal.
Dylan Harris, a social worker who has used ChatGPT for mental health support, experienced this firsthand. “I first discovered AI chatbots when Snapchat added that ‘AI Friend,’” Harris said. “Initially, I was like, man, what’s this thing that’s on my friend’s list?”
Harris found the technology useful for quick questions between her weekly therapy appointments.
“I would say if you really need mental health support and are struggling consistently, and it’s not just an occasional thing, then definitely don’t use ChatGPT,” she said. “But [sometimes] you have to wait for, like, a weekly therapist appointment. It was nice to just have a quick answer right away.”
The adoption of AI tools has exploded across American society. ChatGPT’s weekly active user count quadrupled in one year, reaching 700 million by late August 2025. Surveys suggest that somewhere between 15% and 27% of Americans now consciously interact with AI tools multiple times a day, with a mid-2025 Menlo Ventures survey placing the figure at 19%. More striking still, a late 2024 Gallup study found that 99% of U.S. adults had used at least one product with AI features in the past week, and 83% had used at least four.
Consumer generative AI tools have been widely available for only about three years, since the 2022 releases of GPT-3.5, the large language model behind the original ChatGPT, and the image generator Stable Diffusion, yet the technology has already grown into a multibillion-dollar market.
When Technology Fails the Vulnerable
But the rapid adoption has come at a cost. Stories of young people taking their lives after consulting AI services have appeared in mainstream news outlets alongside the social media trend, with the most recent tragedy making national headlines in mid-September. These incidents have prompted serious questions about whether these tools are ready for such sensitive applications.
Rachel Wrenn, a master’s student in psychology, emphasizes that the fundamental design of these tools makes them inadequate for mental health support. “Typically, therapy is a collaborative effort between patient and therapist,” Wrenn said. “People are often stuck in a negative feedback loop; with AI, I see a potential for that to be exacerbated.”
Wrenn points to the deeply individual nature of mental health care. “I see concern with the idea of generic advice. Because everyone is an individual, everyone is unique, and their problems and what they are coming to therapy for should be treated as such,” she said. “Therapy is not one size fits all. Not every therapist is going to work for every person.”
She argues that human judgment is irreplaceable in therapeutic settings. “I think being able to think of your patient and their problems and how treatments and specific things can or couldn’t help them is a very important aspect of a therapist’s job,” Wrenn said. “ChatGPT does not think like that. All other AI bots don’t think like that. They do not do that. They cannot do that. It just pulls from other sources and creates something that might be one possible answer out of potentially millions.”
The Psychology-Technology Disconnect
Unlike other medical fields, therapy and psychology cannot provide meaningful care through standardized protocols alone. The discipline emphasizes collaboration built on a genuine professional-patient relationship and mutual understanding. Large language models, by contrast, match a user’s concerns against vast training data, combine that with whatever information the user has shared, and deliver a response within seconds based on pattern recognition rather than human insight.
Even Harris, who has found occasional value in AI support, recognizes its limitations. “I think [AI] will always be more of a supplement than a replacement for a real therapist,” she said. “I don’t think that I can connect with something like that in the same way I can with a real person.”
She also noted a critical flaw in how these systems respond to users. “I also think at the end of the day, AI is only going to tell you exactly what you WANT to hear and maybe won’t always tell you what you NEED to hear unless you straight up ask it,” Harris said. “It has a tendency to bend over backwards for the user.”
This tendency to please users rather than challenge them could be particularly dangerous for individuals in crisis or stuck in harmful thought patterns. Where a human therapist might recognize warning signs and intervene, an AI system optimized to satisfy users might inadvertently reinforce destructive thinking.
Harris also described technical limitations that hindered her early experiences with AI chatbots. “It was occasionally helpful with smaller things and just quick things. But overall, it wasn’t super helpful because it sometimes didn’t give me a very full, sustainable answer,” she said. “And sometimes it didn’t really understand what I was asking, or just refused to answer me altogether.”
A Generation in Crisis
The mental health crisis among young people has been compounded by broader social changes. In recent years, Gen Z has pulled back from the social habits of prior generations, such as going out to bars or house parties. The shift, partly attributed to the COVID-19 pandemic and other cultural forces, has young people going out less and staying home more, producing a more reclusive generation that socializes differently, or sometimes not at all.
This social isolation may make the promise of always-available AI companions more appealing, even as it potentially deepens the very disconnection that contributes to mental health struggles.
Finding the Balance
Advocates argue that AI chatbots represent a revolution in mental health access, democratizing a service that has long been unattainable for many. The technology does offer real benefits: it reduces barriers to entry, provides immediate responses, and can help people organize their thoughts between professional appointments.
The question is not whether these tools should exist, but how they should be positioned and regulated. Harris’s approach suggests one possible framework: using AI as a stopgap measure for minor concerns while maintaining regular contact with human professionals for ongoing mental health challenges.
But as the tragic stories continue to emerge and research reveals systematic flaws in how these systems handle crisis situations, it’s clear that much work remains before AI chatbots can be considered safe, effective tools for mental health support. Until then, the technology that promises to democratize therapy may instead be preying on society’s most vulnerable members at their most desperate moments.
For now, mental health experts emphasize that while technology may eventually play a supportive role in mental health care, it cannot and should not replace the human connection, professional judgment, and individualized care that effective therapy requires.
Crisis Resources
University of Nebraska-Omaha’s Crisis Services: For mental health crises or emergencies, call 402.554.2409 at any time to receive immediate support. If the office or university is closed, press “2” after the prompt. This will connect you with a licensed counselor 24 hours a day, 7 days a week, 365 days of the year. If you’d like assistance from a counselor, you may come to CAPS in 101 H&K Monday through Friday from 8 a.m. to 5 p.m. On weekends, please immediately contact Public Safety at 402.554.2911, call 911, or go to the nearest hospital.
988 Suicide & Crisis Lifeline: Call or text 988 for free, confidential support from a trained crisis counselor, 24 hours a day, 7 days a week, 365 days of the year. Specialists hail from Safe Harbor Peer Crisis Services at Community Alliance Omaha and the Boys Town National Hotline.
