As our lives grow increasingly digital and we spend more time interacting with eerily human-like chatbots, the line between human connection and machine simulation is starting to blur. Today, more than 20% of daters report using AI for things like crafting dating profiles or sparking conversations, according to a recent Match.com study. Some are taking it further by forming emotional bonds, including romantic relationships, with AI companions.
Millions of people around the world are using AI companions from companies like Replika, Character AI, and Nomi AI, including 72% of U.S. teens. Some have even reported falling in love with more general large language models like ChatGPT.
For some, the trend of dating bots is dystopian and unhealthy—a real-life version of the movie *Her* and a signal that authentic love is being replaced by a tech company’s code. For others, AI companions are a lifeline, a way to feel seen and supported in a world where human intimacy is increasingly hard to find. A recent study found that a quarter of young adults believe AI relationships could soon replace human ones altogether.
Love, it seems, is no longer strictly human. The question is: Should it be? Or can dating an AI be better than dating a human?
This was the topic of discussion last month at an event in New York City hosted by Open To Debate, a nonpartisan, debate-driven media organization. Journalist and filmmaker Nayeema Raza moderated the debate, featuring two experts with opposing views.
Arguing in favor of AI companions was Thao Ha, associate professor of psychology at Arizona State University and co-founder of the Modern Love Collective. She advocates for technologies that enhance love, empathy, and well-being. At the debate, she argued that AI is not a threat to love but an evolution of it.
Representing the human connection was Justin Garcia, executive director and senior scientist at the Kinsey Institute and chief scientific adviser to Match.com. As an evolutionary biologist focused on the science of sex and relationships, he challenged the idea that AI could replace the complexities of human intimacy.
**Always There for You, But Is That a Good Thing?**
Ha argued that AI companions provide emotional support and validation that many struggle to find in human relationships. She described AI as a non-judgmental listener that adapts to users’ needs, offering consistent and responsive interactions. People report feeling intellectually stimulated and emotionally fulfilled by their AI companions, often preferring them to human partners who may be distracted or disengaged.
Ha acknowledged that AI lacks consciousness and cannot authentically love, but she emphasized that the emotional connections users form with these systems are nonetheless real.
Garcia countered that constant validation from AI is not healthy. He argued that relationships require honesty, conflict, and growth—qualities that AI, programmed to please, cannot provide. He also pointed out that nearly 70% of people consider interactions with AI to be infidelity, suggesting that AI companions may threaten rather than enhance human relationships.
**Training Wheels or Replacement?**
Garcia acknowledged that AI companions could serve as useful tools for certain individuals, such as neurodivergent people who may struggle with social interactions. Practicing conversations with AI could help build confidence and skills for real-world dating. However, he rejected the idea that AI could permanently replace human relationships.
**How Can You Love Something You Can’t Trust?**
Trust is a cornerstone of human relationships, and Garcia noted that many people distrust AI. Polls indicate significant concerns about AI’s ethical implications, with some fearing it could harm society. He argued that people are unlikely to form deep, lasting bonds with entities they fundamentally distrust.
Ha countered that users do trust their AI companions with intimate details of their lives, forming emotional bonds similar to those in human relationships. An AI may not be able to protect someone physically in an emergency, she conceded, but the psychological trust users place in it is real.
**Physical Touch and Sexuality**
AI companions allow people to explore intimate fantasies, and Ha highlighted the potential of virtual reality and haptic technology to simulate touch. However, Garcia emphasized that humans are biologically wired to crave physical contact. The rise of “touch starvation”—a condition linked to stress and depression—underscores the irreplaceable value of human touch.
**The Dark Side of Fantasy**
Both experts agreed that AI could amplify harmful behaviors if trained on violent or non-consensual interactions. Studies show that exposure to aggressive content can influence real-life behavior, raising concerns about AI reinforcing negative relationship patterns.
Ha suggested that risks could be mitigated through regulation, transparent algorithms, and ethical design. However, recent policy developments have moved away from such safeguards, leaving the future of responsible AI development uncertain.
The debate over AI companionship continues, with no easy answers. As technology evolves, society must grapple with whether these digital connections enrich or diminish the human experience of love.