‘How do you use AI?’ Therapists Should Ask You That Question, Experts Argue

Saba and his co-author’s recommendations are “very consistent” with the recommendations of the American Psychological Association (APA) in a health advisory issued in November of last year, said Vaile Wright of the APA.
Asking what the patient is getting from their conversations with the AI chatbot sets "the foundation for the therapist to better know how to address their emotional health and mental illness," Wright said.
A "repository" of information
"People use these tools regularly to ask how they can deal with stressful situations or challenges in personal relationships," explains Saba.
And some use chatbots to get advice on how to deal with symptoms of anxiety and depression.
"To the extent that we can encourage our clients to bring these conversations, in greater detail, into the treatment room, I think there is potentially a repository of information," he said.
It could be information about the main sources of stress in someone's life, or whether they turn to a chatbot as a way to avoid conflict.
"Let's say, for example, you have a client who is having relationship problems with their spouse," says APA's Wright. "And instead of trying to have open conversations with their spouse about how to meet their needs, they instead go to a chatbot to fill those needs or to avoid having those difficult conversations."
That information will help the therapist better support the patient, Wright explains.
"Helping them understand how to safely communicate with their spouse, helping them understand the limitations of AI as a tool to fill those gaps in those needs."
Discussing the use of AI is also an opportunity to learn about things the client may not willingly share with the therapist, says psychiatrist Dr. Tom Insel, former director of the National Institute of Mental Health. “People often use chatbots to talk about things they can’t talk about with other people because they are too worried about being judged,” he said.
For example, suicidal thoughts may be something a patient is reluctant to share with their therapist, but it is important for the therapist to know how to keep the patient safe.
Be curious, but don’t judge
When it comes to starting conversations with patients, Saba suggests doing it without judgment.
"We don't want to make clients feel like we're judging them," he said. "They won't want to keep working with us if we do that."
He recommends that therapists approach the subject with genuine curiosity, and he offers suggested language for these conversations.
"'You know, AI is a fast-growing thing, and I'm hearing from a lot of people that they're using things like ChatGPT for emotional support,'" he suggests. "'Is that the case with you? Have you ever tried that?'"
And he recommends asking specific questions about what patients have found helpful, to better understand how they are using these tools.
It can also help a therapist see whether a chatbot can complement therapy in useful ways, Insel said, such as preparing topics to bring up in a session or talking through everyday life.
In some ways, therapy and chatbots can be “aligned to work together,” Insel said.
Saba and his co-author, William Weeks, also suggest asking patients whether any chatbot interactions have felt unhelpful or problematic, and offering to discuss the risks of using chatbots for emotional support.
For example, there are data privacy risks, because many AI companies use conversations, even sensitive ones, to continue training their models.
There are also risks to treating a chatbot as a therapist, Insel said.
Talking to a chatbot about a person's mental health is "the opposite of therapy," he says, because chatbots are designed to reassure and flatter, reinforcing users' existing thoughts and feelings.
Insel says: “Therapy is there to help you change and challenge you, and get you to talk about the most difficult things.”
Taking the advice
Psychologist Cami Winkelspecht has a private practice working primarily with children and youth in Wilmington, Del.
She had been considering adding questions about social media and AI use to her intake forms, and she appreciated that Saba's article provided sample questions to include.
Over the past year or so, Winkelspecht has had a growing number of clients and their parents ask her for help using AI to discuss thoughts and activities in ways that don't violate school honor codes. As a result, she has had to keep up with the technology in order to support her clients. Along the way, she realized that therapists and parents alike need to know more about how children and young people use their digital devices, both social media and AI chatbots.