Can chatbots replace human therapists? Some startups — and patients — claim they can. But it’s not exactly science.
A study found that 80% of people who have used OpenAI's ChatGPT for mental health advice consider it a good alternative to regular therapy, and a separate report found that chatbots can be effective in reducing some symptoms of depression and anxiety. On the other hand, it is well established that the relationship between therapist and client (the human connection, in other words) is among the best predictors of success in mental health treatment.
Three entrepreneurs — Dustin Klebe, Lukas Wolf and Chris Aeberli — are in the pro-chatbot therapy camp. Their startup, Sonia, offers an "artificial intelligence therapist" that users can talk to or message through an iOS app about a range of topics.
“To some extent, building an AI therapist is like developing a drug, in the sense that we’re building a new technology as opposed to repackaging an existing one,” Klebe, Sonia’s CEO, told TechCrunch.
The three met in 2018 while studying computer science at ETH Zurich and moved to the US together to pursue graduate studies at MIT. Shortly after graduation, they reunited to launch a startup that could encapsulate their shared passion for scalable technology.
This startup became Sonia.
Sonia leverages a range of generative AI models to analyze what users say during "therapy sessions" in the app and respond to them. Applying techniques from cognitive behavioral therapy, the app, which charges users $20 a month or $200 a year, assigns "homework" meant to drive home insights from conversations, along with visualizations designed to help users identify their top stressors.
Klebe claims that Sonia, which is not FDA approved, can treat issues ranging from depression, stress and anxiety to relationship problems and poor sleep. For more serious scenarios, such as people having violent or suicidal thoughts, Sonia has “additional algorithms and models” to detect “emergency situations” and direct users to national hotlines, Klebe says.
Somewhat disturbingly, none of Sonia’s founders have a background in psychology. But Klebe says the startup is consulting psychologists, recently hired a cognitive psychology graduate, and is actively hiring a full-time clinical psychologist.
“It’s important to emphasize that we do not consider human therapists or any companies that provide physical or virtual human-delivered mental health care as competition,” Klebe said. “For every response that Sonia generates, there are approximately seven additional language model calls that occur in the background to analyze the situation from many different therapeutic perspectives in order to adapt, optimize and personalize Sonia’s chosen therapeutic approach.”
What about privacy? Can users be sure that their data isn't kept on a vulnerable cloud, or used to train Sonia's models without their knowledge?
Klebe says Sonia is committed to storing only the "bare minimum" of personal information needed to administer treatment: a user's age and name. However, he did not say where, how or for how long Sonia stores chat data.


Sonia, which has about 8,000 users and $3.35 million in backing from investors including Y Combinator, Moonfire, Rebel Fund and SBXi, is in talks with unnamed mental health organizations to offer Sonia as a resource through their online portals. Reviews for Sonia on the App Store have been quite positive so far, with several users noting that it's easier to talk to the chatbot about their problems than to a human therapist.
But is this a good thing?
Today's chatbot technology is limited in the quality of advice it can provide, and may not pick up on more subtle signs of a problem, such as a person with anorexia asking how to lose weight. (Sonia didn't even ask the person's weight.)
Chatbot responses are also colored by the biases, often Western ones, reflected in their training data. As a result, they're more likely to miss cultural and linguistic differences in how people express mental illness, particularly if English is a person's second language. (Sonia only supports English.)
In the worst cases, chatbots go off the rails. Last year, the National Eating Disorders Association came under fire for replacing its human helpline staff with a chatbot, Tessa, that dispensed weight loss advice harmful to people with eating disorders.
Klebe emphasized that Sonia is not trying to replace human therapists.

“We’re building a solution for the millions of people who struggle with their mental health but can’t (or won’t) have access to a human therapist,” Klebe said. “We intend to fill the gigantic gap between demand and supply.”
There is definitely a gap, both in the ratio of professionals to patients and in the cost of treatment relative to what most patients can afford. More than half of the US lacks adequate geographic access to mental health care, according to a recent government report. And a recent survey found that 42% of US adults with a mental health condition were unable to receive care because they could not afford it.
An article in Scientific American notes that treatment apps tend to target the "worried well," people who can afford therapy and app subscriptions, rather than individuals who may be most at risk but don't know how to seek help. At $20 a month, Sonia isn't exactly cheap, but Klebe argues that it's cheaper than a typical therapy appointment.
"It's much easier to get started with Sonia than to see a human therapist, which involves finding a therapist, waiting on a four-month waiting list, going there at a set time and paying $200," he said. "Sonia has already seen more patients than a human therapist would in their entire career."
I just hope the Sonia founders stay transparent about what the app can and can't handle as they develop it.
