As the use of artificial intelligence chatbots like ChatGPT, Claude, and Character.AI grows, teens and adults alike are increasingly turning to these technologies for companionship and emotional support. This trend raises important questions for mental health care providers, who, according to a recent study published in JAMA Psychiatry, should ask patients how they interact with AI, much as they already ask about sleep, diet, and lifestyle habits.
Shaddy Saba, an assistant professor at New York University’s Silver School of Social Work, emphasized that the goal is not to categorize AI use as either good or bad. “We’re not saying that AI use is good or bad,” he noted, drawing parallels to substance use or consulting friends for advice. Instead, understanding a patient’s engagement with AI could unveil critical aspects of their mental health and emotional well-being.
Saba and his co-authors recommend that therapists ask patients about their interactions with AI chatbots, as this can provide valuable context regarding their emotional states. Vaile Wright from the American Psychological Association echoes this advice, suggesting that discussing AI use can create a foundation for more effective therapeutic relationships. “It’s a chance to learn about things a client might not voluntarily share,” she said.
According to Saba, individuals frequently turn to chatbots to navigate stressful experiences and personal relationship challenges. “People are using these tools on a regular basis to ask about how to cope,” he explained. This opens the door to a richer exchange of information during therapy sessions, deepening the therapist’s understanding of a client’s stressors and coping mechanisms.
The dialogue surrounding AI use also serves a practical purpose. For example, Saba suggests that exploring why a client may seek solace from a chatbot instead of addressing issues directly with loved ones can inform therapeutic strategies. “If a client is going to the chatbot to avoid confrontations, that information is crucial for their support,” he said.
Dr. Tom Insel, a psychiatrist and former director of the National Institute of Mental Health, adds that discussing AI can help identify sensitive topics that patients may be reluctant to disclose. For instance, suicidal thoughts might be difficult for a patient to share openly with a therapist, yet discussing interactions with a chatbot could pave the way for more candid conversations.
When broaching the subject, Saba advises therapists to approach the topic without judgment. “We don’t want to make clients feel like we’re judging them,” he explained. Instead, he recommends expressing genuine curiosity about their experiences. He suggests language such as, “AI is something that’s kind of rapidly growing, and I’m hearing from a lot of people that they’re using things like ChatGPT for emotional support. Is that the case for you?” This approach encourages patients to open up about their use of these technologies.
Insel notes that AI chatbots could complement therapeutic practices, allowing clients to vet topics for discussion or vent about daily life stressors. However, he warns that treating a chatbot as a substitute for therapy can be problematic. “Talking with a chatbot about one’s mental health is the opposite of therapy,” he said, emphasizing that therapy is designed to challenge and facilitate meaningful conversations, often addressing difficult issues.
In a practical application of these insights, psychologist Cami Winkelspecht, who works primarily with children and adolescents, has begun to consider incorporating questions about social media and AI usage into her intake forms. As more families seek advice on navigating the ethical use of AI in academic settings, Winkelspecht recognizes the necessity for mental health professionals to engage with clients about their digital habits more holistically. “We don’t necessarily think about what they’re doing with their phones quite as much,” she remarked, advocating for more comprehensive discussions on technology use.
This growing intersection between mental health care and AI technology highlights the importance of understanding how these tools impact emotional well-being. As AI continues to evolve and permeate daily life, both therapists and patients may find new avenues to enhance their conversations and therapeutic experiences, ultimately fostering a better understanding of emotional health in the digital age.