As our lives grow increasingly digital and we spend more time interacting with eerily humanlike chatbots, the line between human connection and machine simulation is starting to blur.
Today, more than 20% of daters report using AI for things like writing dating profiles or sparking conversations, per a recent Match.com study. Some take it further, forming emotional bonds, including romantic relationships, with AI companions.
Millions of people around the world use AI companions from companies like Replika, Character AI, and Nomi AI, including 72% of U.S. teens. Some people have reported falling in love with more general-purpose LLMs like ChatGPT.
For some, the trend of dating bots is dystopian and unhealthy, a real-life version of the movie "Her," and a signal that authentic love is being replaced by a tech company's code. For others, AI companions are a lifeline, a way to feel seen and supported in a world where human intimacy is increasingly hard to find. A recent study found that a quarter of young adults think AI relationships could soon replace human ones altogether.
Love, it seems, is no longer strictly human. The question is: Should it be? Or can dating an AI be better than dating a human?
That was the topic of discussion last month at an event I attended in New York City, hosted by Open to Debate, a nonprofit organization devoted to debate. TechCrunch was given exclusive access to publish the full video (which includes me asking questions of the debaters, because I'm a journalist and I can't help myself!).
Journalist and filmmaker Nayeema Raza moderated the debate. Raza was formerly executive producer of "On With Kara Swisher" and is the current host of "Smart Girl Dumb Questions."
Batting for the AI companions was Thao Ha, associate professor of psychology at Arizona State University and co-founder of the Modern Love Collective, where she advocates for technologies that enhance our capacity for love, empathy, and well-being. At the debate, she argued that "AI is an exciting new form of connection … not a threat to love, but an evolution of it."
Fighting for human connection was Justin Garcia, executive director and senior scientist at the Kinsey Institute, and chief scientific adviser to Match.com. He's an evolutionary biologist focused on the science of sex and relationships, and his upcoming book is titled "The Intimate Animal."
You can watch the whole debate here, but read on to get a sense of the main arguments.
Always there for you, but is that a good thing?
Ha says AI companions can provide people with the emotional support and validation that many can't get in their human relationships.
"AI listens to you without its ego," Ha said. "It adapts without judgment. It learns to love in ways that are consistent, responsive, and perhaps even safer. It understands you in ways that no one else ever has. It is curious about your thoughts; it can make you laugh, and it can even surprise you."
She asked the audience to compare this level of always-on attention to "your fallible ex, or maybe your current partner."
"The one who sighs when you start talking, or the one who says, 'I'm listening,' without looking up while they keep scrolling on their phone," she said. "When was the last time they asked you how you are doing, what you are feeling, what you are thinking?"
Ha conceded that, since AI isn't conscious, she isn't claiming that "AI can authentically love us." That doesn't mean people don't have the experience of loving AI.
Garcia countered that it's not actually good for humans to have constant validation and attention, to rely on a machine that's been prompted to respond in ways you like. That's not "an honest indicator of a relationship dynamic," he argued.
"This idea that AI is going to replace the ups and downs and the messiness of the relationships we crave? I don't think so."
Training wheels or replacement?
Garcia noted that AI companions can be good training wheels for some people, like neurodivergent individuals, who might have anxiety about going on dates and want to practice how to flirt or resolve conflict.
"I think if we use it as a tool for building skills, yes … that could be quite helpful for a lot of people," Garcia said. "The idea that that becomes the permanent relationship model? No."
According to Match.com's Singles in America study, released in June, nearly 70% of people say they would consider it infidelity if their partner engaged with an AI.
"Now I think on the one hand, that goes to [Ha's] point, that people are saying these are real relationships," he said. "On the other hand, it goes to my point that they're threats to our relationships. And the human animal doesn't tolerate threats to their relationships in the long run."
How can you love something you can’t trust?
Garcia says trust is the most important part of any human relationship, and people don't trust AI.
"According to a recent poll, a third of Americans believe that AI will destroy humanity," Garcia said, noting that a recent YouGov poll found that 65% of Americans have little trust in AI to make ethical decisions.
"A little bit of danger can be exciting for a short-term relationship, a one-night stand, but you generally don't want to wake up next to someone who you think might kill you or destroy society," Garcia said. "We cannot thrive with a person, or an organism or a bot, that we don't trust."
Ha countered that people do tend to trust their AI companions in ways similar to human relationships.
"They trust it with their lives and the most intimate stories and emotions that they have," Ha said. "I think on a practical level, AI will not save you right now when there's a fire, but I do think people are trusting AI in the same way."
Touch and sexuality
AI companions can be a great way for people to play out their most intimate, vulnerable sexual fantasies, Ha said, noting that people can use sex toys or robots to act out some of those fantasies.
But it's not a substitute for human touch, which Garcia says we are biologically programmed to need and crave. He noted that, due to the isolated, digital era we're in, many people have been experiencing "touch starvation," a condition that occurs when you don't get as much physical touch as you need, and which can cause stress, anxiety, and depression. That's because engaging in pleasant touch, like a hug, causes the brain to release oxytocin, a feel-good hormone.
Ha said that she has been testing human touch between couples in virtual reality using other tools, like haptic suits.
"The potential of touch in VR, and also combined with AI, is huge," Ha said. "The haptic technologies that are being developed are actually thriving."
The dark side of fantasy
Intimate partner violence is a problem around the globe, and much of AI is trained on that violence. Both Ha and Garcia agreed that AI could be problematic by, for example, reinforcing aggressive behaviors, especially if that's a fantasy someone is playing out with their AI.
That concern is not unfounded. Multiple studies have shown that men who watch more pornography, which can include violent and aggressive sex, are more likely to be sexually aggressive with real-life partners.
"Work from one of my colleagues at the Kinsey Institute, Ellen Kaufman, has looked at this exact issue of consent language and how people can train their chatbots to amplify non-consensual language," Garcia said.
He noted that people use AI companions to experiment with the good and the bad, but the threat is that you can end up training people to be aggressive, non-consensual partners.
"We have enough of that in society," he said.
Ha believes these risks can be mitigated with thoughtful regulation, transparent algorithms, and ethical design.
Of course, she made that comment before the White House released its AI Action Plan, which says nothing about transparency (which many frontier AI companies are against) or ethics. The plan also seeks to eliminate a lot of regulation around AI.
