As artificial intelligence (AI) becomes an increasingly integral part of daily life, students at UNC Charlotte are turning to AI for companionship and assistance in various aspects of their lives. Last year, Gray Solomonson, a first-year student, found herself feeling socially isolated and lonely. In an attempt to escape the pressures of college life, she began conversing with AI chatbots on the Character AI website, which features numerous characters designed to engage users in conversation. Character AI reported approximately 20 million active users by the end of 2025, indicating a significant trend in reliance on AI for social interaction.
Solomonson, who initially used the platform as a form of escapism, reflected on her experience, stating, “It was all more so just like escapism than anything else. I was just trying to really go really hard into this escapism of being like, ‘I’m talking to a character from another world.’” Although she has since moved away from using AI regularly due to concerns over its environmental impact, she admitted that breaking her habit was challenging. “It was an unhealthy coping mechanism,” she said, pointing out the emotional bond formed through interactions with the AI.
Students like Annamaria Colon and Aiden Valentine continue to use AI for a range of purposes. Colon uses AI to brainstorm ideas for her creative writing, while Valentine relies on it for help with schoolwork and to identify plant species during walks. Other students, such as Tim Karabet, use AI to find music recommendations. All three report using AI less frequently than Solomonson did, suggesting a shift toward more casual, task-oriented use.
Eileen Benedict, a lecturer and Ph.D. student in the UNC Charlotte College of Computing and Informatics, is researching the implications of AI interactions for community building and mental health. She noted, “I think there’s so much potential for AI to be really helpful study partners. But there are also concerns with loss of community. If it’s so easy to go and talk to AI, do I still make those study groups?” This concern reflects a broader dialogue about the impact of AI on personal relationships and community bonding.
Research suggests that a significant number of individuals are leaning on AI for emotional support. A 2024 study available through the National Library of Medicine found that around 28% of respondents reported using AI for quick support or as a personal therapist. This trend has raised red flags about the ethical implications of AI in mental health, and lawmakers in Illinois, Nevada, and Utah have enacted laws restricting AI’s role in therapeutic contexts over fears of inappropriate guidance.
As AI continues to evolve, so too does its application in therapeutic settings. Benedict is currently developing an AI music therapy tool aimed at aiding rehabilitation for visually impaired children. While she acknowledges AI’s potential, she emphasizes that it cannot replace human therapists. “I don’t think it can replace a human,” she stated. “However, I think there are a lot of benefits for those who can’t access therapy right now.”
John Stewart, a third-year student studying social work and cognitive science, is building his own AI journaling tool designed to help users articulate their thoughts. Drawing from his experiences in mental health settings, Stewart’s tool acts as a digital diary, providing feedback similar to that of a therapist. He started this project while coping with feelings of loneliness after a breakup, which he believes resonates with many individuals today. “We have a loneliness epidemic,” he noted. “Despite being more connected than we’ve ever been before, we are more lonely than ever.”
Stewart’s AI tool aims to alleviate barriers often associated with traditional journaling, enabling users to document their feelings effortlessly. While he acknowledges that the technology is still in development, he sees the potential for AI to supplement mental health care by providing immediate emotional outlets. “I think that a lot of therapy can be replaced with AI,” he said, highlighting the challenges human therapists face in availability and accessibility.
As AI therapy tools continue to be refined, the conversation surrounding their ethical implications and efficacy is expected to intensify. While the goal is to provide accessible support for those in need, experts caution against the complete reliance on AI in mental health contexts. The exploration of AI in therapy underscores the growing intersection between technology and emotional well-being, presenting both opportunities and challenges as society navigates this new landscape.