
AI romantic companions are on the rise. They have red flags too
Current advances in AI technologies are marking a new era for intimate romantic and sexual relationships.

Image: Canva AI
By The Conversation
7 July 2024
Over the past decade, virtual assistants powered by artificial intelligence, like Apple’s Siri and Amazon’s Alexa, have become integral to technologies such as smartphones and social media.
More recently, a new type of humanlike chatbot is on the rise: the AI romantic companion. Chatbots are AI-powered programs that engage with humans through text, voice, and images.
Currently, over 100 AI-powered applications—such as myanima.ai, Eva AI, Nomi.AI, and Replika—offer romantic and sexual companions with extensive personalization options, including physical and personality features. Exhibiting remarkable realism, adaptability and interactive fluidity, these AI chatbots can progressively evolve through conversation, fine-tuning their responses to match users’ interests, needs and communication styles.
Modern AI chatbots have increasingly humanlike qualities that raise users’ propensity to engage and form emotional bonds—even to the point of falling in love.
Exacerbated by COVID-19 pandemic restrictions, loneliness has led more people to use AI as a substitute for counselors, friends, and romantic partners.
Romantic partner substitutes
Research shows AI chatbots can offer companionship, ease loneliness and boost positive emotions with supportive messages. Chatbots also provide a judgment-free space for open conversations and advice when other resources are scarce. People can also form intimate and passionate connections with AI that are similar to human relationships.
Surprisingly, there seems to be little difference in enjoyment, sexual arousal, or emotional response whether participants believe they are interacting with a human or a chatbot. One study even found that people felt a stronger emotional connection with a chatbot than with a less responsive human during a conversation.
Research repeatedly suggests that humans can form genuine emotional bonds with AI, even if they acknowledge it is not a “real” person. Although many people appear to derive psychological benefits from using chatbots, the potentially harmful consequences of these relationships remain unclear.