AI companions: A threat to love, or an evolution of it?

As our lives become more digital and we spend more time interacting with eerily human chatbots, the line between human connection and machine simulation begins to blur.
Today, more than 20% of daters report using AI for things like crafting dating profiles or sparking conversations, per a recent Match.com study. Some are taking it further by forming emotional bonds, including romantic relationships, with AI companions.
Millions of people around the world use AI companions from companies like Replika, Character.AI, and Nomi, including 72% of American teenagers. Some people have even reported falling in love with more general-purpose LLMs like ChatGPT.
For some, the trend of dating bots is dystopian and unhealthy, a real-life version of the film “Her” and a sign that authentic love is being replaced by a tech company’s code. For others, AI companions are a lifeline, a way to feel seen and supported in a world where human intimacy is increasingly hard to find. A recent study found that a quarter of young adults think AI relationships could soon replace human ones altogether.
Love, it seems, is no longer strictly human. The question is: should it be? Or can dating an AI be better than dating a person?
That was the subject of a debate last month at an event I attended in New York City, organized by Open to Debate, a nonpartisan, debate-driven media organization. WAN got exclusive access to publish the full video (which includes me asking the debaters a question, because I’m a reporter and I can’t help myself!).
Journalist and filmmaker Nayeema Raza moderated the debate. Raza was formerly the on-air executive producer of the podcast “On With Kara Swisher” and is the current host of “Smart Girl Dumb Questions.”
Batting for AI companions was Thao Ha, associate professor of psychology at Arizona State University and co-founder of the Modern Love Collective, where she advocates for technologies that enhance our capacity for love, empathy, and well-being. During the debate, she argued that “AI is an exciting new form of connection … not a threat to love, but an evolution of it.”
Arguing for human connection was Justin Garcia, executive director and senior scientist at the Kinsey Institute and chief scientific adviser to Match.com. He’s an evolutionary biologist focused on the science of sex and relationships, and his forthcoming book is titled “The Intimate Animal.”
You can watch the whole thing here, but read on to get a sense of the main arguments.
Always there for you, but is that a good thing?
Ha says that AI companions can offer people the emotional support and validation that many can’t get in their human relationships.
“AI listens to you without its ego,” said Ha. “It adapts without judgment. It learns to love in ways that are consistent, responsive, and perhaps even safer. It understands you in a way that no one else ever has. It is curious enough about your thoughts, it can make you laugh, and it can even surprise you with a poem. People generally feel loved by their AI.”
She asked the audience to compare this level of constant attention with “your fallible ex or perhaps your current partner.”
“The one who sighs when you start talking, or the one who says, ‘I’m listening,’ without looking up while they keep scrolling on their phone,” she said. “When was the last time they asked you how you are doing, what you feel, what you think?”
Ha conceded that because AI has no consciousness, she isn’t claiming that “AI can authentically love us.” That doesn’t mean people don’t have the experience of being loved by AI.
Garcia countered that it’s not actually good for humans to have constant validation and attention, to rely on a machine that has been prompted to respond in ways you like. That isn’t “an honest indicator of a relationship dynamic,” he argued.
“This idea that AI is going to replace the ups and downs and the messiness of relationships we crave? I don’t think so.”
Training wheels or a replacement?
Garcia noted that AI companions can be good training wheels for certain people, like neurodivergent folks, who may have anxiety about going on dates and want to practice how to flirt or resolve conflicts.
“I think if we use it as an aid to build skills, yes … that can be very useful for many people,” said Garcia. “The idea that it will be the permanent relationship model? No.”
According to a Match.com Singles in America study released in June, nearly 70% of people say they would consider it infidelity if their partner engaged with an AI.
“Now, on the one hand, that goes to [Ha’s] point, that people say these are real relationships,” he said. “On the other hand, it goes to my point that they are threats to our relationships. And the human animal doesn’t tolerate threats to its long-term relationships.”
How can you love something that you can’t trust?
Garcia says that trust is the most important part of any human relationship, and people don’t trust AI.
“According to a recent poll, a third of Americans think that AI will destroy humanity,” Garcia said, noting that a recent YouGov survey found that 65% of Americans have little trust in AI to make ethical decisions.
“A little bit of risk can be exciting for a short-term relationship, a one-night stand, but you generally don’t want to wake up next to someone who you think might kill you or destroy society,” Garcia said. “We can’t thrive with a person or an organism or a bot that we don’t trust.”
Ha countered that people do tend to trust their AI companions in ways comparable to human relationships.
“They trust it with their lives and the most intimate stories and emotions they have,” said Ha. “I think, on a practical level, AI won’t save you right now if there’s a fire, but I do think people trust AI in the same kind of way.”
Physical touch and sexuality
AI companions can be a great way for people to play out their most intimate, vulnerable sexual fantasies, Ha said, noting that people can use sex toys or robots to see some of those fantasies through.
But it’s not a replacement for human touch, which Garcia says we are biologically programmed to seek out and crave. He noted that, in the isolated, digital age we live in, many people have been experiencing “touch starvation” — a condition that occurs when you don’t get as much physical touch as you need, and which can lead to stress, anxiety, and depression. That’s because engaging in pleasant touch, like a hug, causes your brain to release oxytocin, a feel-good hormone.
Ha said she has been testing human touch between couples in virtual reality using tools like haptic suits.
“The potential of contact in VR and also connected to AI is huge,” said Ha. “The tactile technologies that are being developed are actually booming.”
The dark side of fantasy
Intimate partner violence is a problem around the world, and much of AI is trained on that violence. Both Ha and Garcia agreed that AI could be problematic in, for example, reinforcing aggressive behavior — especially if that’s a fantasy someone plays out with their AI.
That concern isn’t unfounded. Multiple studies have shown that men who watch more pornography, which can include violent and aggressive sex, are more likely to be sexually aggressive with real-life partners.
“Work by one of my Kinsey Institute colleagues, Ellen Kaufman, has looked at this exact issue of consent language and how people can train their chatbots to reinforce non-consensual language,” Garcia said.
He noted that people use AI companions to experiment with the good and the bad, but the threat is that you can end up training people to be aggressive, non-consensual partners.
“We have had enough of that in society,” he said.
Ha thinks these risks can be mitigated with thoughtful regulation, transparent algorithms, and ethical design.
Of course, she made that comment before the White House released its AI Action Plan, which says nothing about transparency (which many frontier AI companies oppose) or ethics. The plan also seeks to roll back many regulations on AI.