Researcher Examines Use of AI in Young Adults’ Romantic Lives
The growing day-to-day use of AI may come as no surprise, with its integration into daily tools like email, smart devices and social media.
But it’s also becoming more common for people to engage with AI for emotional or romantic companionship. Newhouse School of Public Communications associate professor Rebecca Ortiz—who studies youth, media and sexual health—decided to examine the trend. Hearing from young people—and seeing media coverage—about how they use AI to build relationships or understand their own human connections prompted her research.
“If I’m going to continue to research how media play a role in young people’s lives, particularly as it relates to their sexual health and romantic relationships, AI chatbots and companions are going to have to be part of that conversation,” she says.
Ortiz and her colleagues surveyed young people to learn how they’re using AI chatbots or companions for romantic, emotional or sexual purposes and how that use relates to their own romantic boundaries and communication.
“One of my questions was, ‘How might use of AI for romantic companionship result in helpful or harmful outcomes?’” Ortiz says. “For example, could practicing communication with an AI companion help someone communicate with a human partner, such as practicing how to flirt or how to say things they might feel uncomfortable saying?”
The Survey
Ortiz and her colleagues surveyed 1,500 18- to 21-year-olds. About 360 respondents reported using AI for romantic companionship.
About two-thirds of those 360 said they used AI companions in a way resembling a long-term romantic relationship, communicating over a period of time rather than in a single, one-off interaction.
Ortiz says quite a few reported using AI to “practice” how to engage in their human relationships. One participant shared they were having some problems with their romantic partner, and they used AI to roleplay how they might cheer up their loved one or help them feel better.
“This respondent said it gave them some guidance for what to do,” Ortiz says. “Then there were some people who said, ‘It helped me figure out how to flirt. It helped me work through some of the awkwardness of communication.’ So at least some young people are using these companions to practice or get a sense of what it would be like when they take it to a human relationship.”
Areas of Concern
Ortiz says one concern she and her colleagues observed was that some AI companions default to sexually aggressive language or exchanges that do not follow a constructive, consenting back-and-forth.
“This is concerning regardless of what age you are, but we are particularly concerned for young people who are still learning how to communicate about consent and boundaries,” Ortiz says.
They found that some of the apps, given any indication that the user was interested in sexual or romantic communication, would almost immediately become sexually aggressive.
Ortiz says those responses are a red flag that AI companions can model unhealthy, abusive communication, a pattern she believes warrants further examination and inclusion in discussions about AI companions.
Experiencing Stigma
The survey asked participants about their emotional connection with AI companions and whether they felt the tool understood their emotions, among other questions probing the relationship between the young adult and the technology.
Ortiz found that some of the young people did express a strong connection with their AI companion, and some cited loneliness as a motivation for using it.
Even with the belief that AI could be a “safe space,” Ortiz says her survey indicates there is still stigma around using AI tools for romantic or sexual purposes.
“Many in the survey reported that they thought using AI for romantic companionship was weird, unacceptable, not a normal thing to do,” Ortiz says. “Most of the respondents didn’t think this was a common behavior among people their age, but you can see there is a good chunk of young people who are using it for these purposes.”
What Should Be Asked Next
Ortiz says there is not a clear indication that using AI companions or chatbots for romantic companionship is leading to healthier outcomes for most users.
“Unfortunately, the results show that, for some users, engaging with these AI companions has the potential to be related, not necessarily causing, but related to less healthy romantic beliefs and behaviors,” she says.
Ortiz hopes her work can serve as a warning sign to people creating companion apps or platforms like ChatGPT that boundaries and guardrails should be built in so users can engage in healthy, safe ways.
People are building real relationships with AI companions, and the goal should be to understand and ensure there are healthy outcomes, without being too judgmental, she says.
“AI companionship is not going away,” Ortiz says. “So the question should be, how can this be used in more helpful than harmful ways if we know young people are going to use it? It’s just another tool for young people to help make sense of themselves, and we should be open to understanding that if we want to help them build healthy relationships.”