
AI Chatbot Therapy 
PCOM Psychology Weighs In on Benefits, Boundaries and Rising Use in Teens


January 20, 2026

More than half of global AI users report turning to AI for emotional or mental well-being, and among teens, the trend is even more striking: over 70% have used AI chatbots, and more than 50% use them regularly for emotional support.

PCOM professors in the School of Professional and Applied Psychology – Scott Glassman, PsyD; Lisa Corbin, PhD, LPC, NCC; and Jessica Kendorski, PhD, NCSP, BCBA-D – are examining what this shift means for mental health and adolescent development, and what users need to know when turning to these tools.

AI as a First Step — Not a Replacement for Therapy

For individuals who face barriers to mental health care — including cost, stigma or long waitlists — AI tools can provide immediate, low-pressure support.

“AI can open the door, but it can't walk you through it the way a trained therapist can,” said Glassman, director of the Master of Applied Positive Psychology program.

Glassman describes AI chatbots as a form of emotional triage, helping users identify emotions and practice brief strategies such as grounding exercises, cognitive reframing and self-compassion. These tools may help people feel momentarily supported or less alone, but they lack the depth and nuance of human connection.

“AI can sound empathetic, but empathy isn't just language — it's a felt connection between two people,” Glassman said.

He notes that while chatbots can model reflective listening and affirming responses, that artificial empathy is detached from real relationships. For some individuals, especially those who struggle to form or maintain healthy social connections, overreliance on AI may reduce motivation to seek meaningful human support.

Corbin, department chair and director of the Master of Science in Mental Health Counseling program, agrees that AI should function as a bridge to professional care — not a substitute. She points to AI's usefulness in helping users understand different therapy options, prepare for an initial appointment or locate licensed providers.

“AI may help someone take the first step,” Corbin said. “But real healing still happens within a human relationship.”

Why Teen Use Raises Additional Concerns

Teens often turn to AI during moments of vulnerability, and chatbots embedded into familiar platforms can feel validating, responsive and personal.

Kendorski, department chair and director of the MS and certificate programs in Applied Behavior Analysis, notes that this familiarity can create a false sense of connection. “The interaction may feel warm or caring,” she said, “but the chatbot isn't forming a real relationship.”

Glassman adds that certain individuals may be especially vulnerable to misunderstanding or misusing AI tools, particularly if they begin to view chatbots as a primary source of emotional support. When safeguards fail or boundaries blur, the risks can be significant.

Kendorski encourages parents to take an active role by explaining how AI works, setting limits around use and checking in regularly about emotional reliance.

“Teens don't need to fear AI,” she said. “They need to understand its purpose — and its limits.”

Where AI May Fit — With Care

Glassman sees the most appropriate role for AI within a positive psychology framework, particularly for adults without clinical mental health disorders who are looking to reinforce healthy habits or well-being goals. Even then, he stresses the importance of caution and oversight.

He also emphasizes the need for greater involvement from mental health professionals in the development of AI tools.

“For technology that affects emotional well-being this deeply, therapists need a stronger voice in how it's designed, tested and monitored,” Glassman said.

While AI may offer moments of guidance or reflection, it cannot replace the care of a licensed clinician; those seeking professional mental health support are encouraged to explore services available through PCOM's Psychological Services Center.


About Philadelphia College of Osteopathic Medicine

Established in 1899, Philadelphia College of Osteopathic Medicine (PCOM) has trained thousands of highly competent, caring physicians, health practitioners and behavioral scientists who practice a “whole person” approach to care—treating people, not just symptoms. PCOM, a private, not-for-profit accredited institution of higher education, operates three campuses (PCOM, PCOM Georgia and PCOM South Georgia) and offers doctoral degrees in clinical psychology, educational psychology, osteopathic medicine, pharmacy, physical therapy, and school psychology. The college also offers graduate degrees in applied behavior analysis, applied positive psychology, biomedical sciences, forensic medicine, medical laboratory science, mental health counseling, physician assistant studies, and school psychology. PCOM students learn the importance of health promotion, research, education and service to the community. Through its community-based Healthcare Centers, PCOM provides care to medically underserved populations. For more information, visit pcom.edu or call 215-871-6100.

Contact Us

For general media inquiries, please contact the Office of Marketing and Communications at 215-871-6300 or communications@pcom.edu. Visit our media relations page to view contact information for public relations personnel.


Media Inquiries

Ally Wengel
Public Relations Manager
Office of Marketing and Communications
Email: allywe@pcom.edu
Office: 215-871-6325
