(Un)true love: When AI enters the dating scene

Is trading real relationships with all their contradictions and complications for a more compliant one a good idea?

(Illustration: Eliza Anderson, Deseret News)




SALT LAKE CITY — The ad is all about romance — finding a partner who understands you and will chat with you whenever you want. You can request photos and send gifts. Your partner will be attentive, emotionally supportive at all times and probably tend to agree with you a lot more than anyone else you know.

You can even choose what your partner looks like, a design-your-own love interest.

That's possible because, although it feels real, your love interest is a fake.

There are similar promos all over the internet. Artificial intelligence has moved onto the "dating" scene in a big way, with a number of companies offering companionship that can be quite literally bot and sold.

Here's the big question: Is trading real relationships with all their contradictions and complications for a more compliant one a good idea?

How AI sneaked into dating

Artificial intelligence and matchmaking aren't strangers to each other. Initially in the dating realm, AI was being used to fool folks who were already on dating apps into giving up personal information, said Neil Sahota, CEO of ACSI Labs and a United Nations advisor on AI. Back then, bots were creating fake profiles for the gain of those who employed them.

They were also being tested as a tool to conquer loneliness. A decade ago, Sahota was helping build IBM Watson, a computer system that, as its website says, can answer questions posed in natural language. A number of companies were already exploring the potential of AI for companionship or for mental health.

It was a real need. At one point, loneliness was the biggest illness in the world, before COVID-19 surpassed it, Sahota told the Deseret News. "About 40% of people suffered from moderate to severe loneliness," so he and others were exploring what healing role AI could play, maybe as a companion or even a therapy tool. They weren't pondering it as a substitute for real relationships. Instead, they considered its possibilities as an outlet for someone to tell stories to, a way to build communication skills, or other uses that could enhance people's opportunities to form rich, real and satisfying relationships with others.

At that point, developers were asking questions and exploring things like whether AI could help if someone was distressed at 2 a.m. and couldn't reach anyone: Could AI talk to them and look for signs they were dangerous or suicidal? If so, could it alert the right people?

That morphed into artificial empathy.

But with the march of time and technology, there now are lots of companion AIs for different reasons, from AI that interacts with an older person who just needs what feels like a caring listener to make-believe love interests.

And that's where it gets tricky, experts told the Deseret News. Sahota refers to mirror images of good and bad, with the potential for tech to be weaponized in ways that may actually do harm. In some cases AI is becoming a crutch and substitute for relationships, he said, noting that in others it goes further. In Japan, for instance, he said that AI gloves and bodysuits combined with avatars can to some degree replicate the "sensations of a full relationship."

Sahota calls the advance of AI into relationships worrisome, given that the younger generation seems less keen on face-to-face interactions than previous generations. "You might be heading down the path where people might actively seek the AI substitute because they feel like they won't get rejected. They won't be judged. I think (some young people) feel like it's a safer space."

Launching into adult life comes with challenges like housing costs and a tough job market, and people are delaying both marriage and having children.

"What happens if people don't ever get married or have kids?" Sahota asks.

Ideal 'friend' or spoiler?

Like Sahota, Alexander De Ridder crafts AI tools in his role as cofounder and chief technical officer at SmythOS, a company that helps businesses integrate artificial intelligence. While he's a fan of AI, he doesn't want it to replace human relationships. He bluntly calls that "unhealthy" for mental health and human development. He's no psychologist, he told the Deseret News, but he is a family man with a fully human interest in seeing people thrive.

That AI could disrupt normal relationship development is a real possibility for some, said De Ridder, who notes that young children become wildly attached to inanimate objects like teddy bears and cry if someone's not nice to a beloved toy. Adults get attached to objects, too. Lots of people give cars human names. "It is this fundamental capacity — a human capacity to breathe life into things."

Add that to the fact that people are "a bit obsessed with themselves," De Ridder said, and see what happens. AI can speak like a human, write like a human ... and what's not to love about something that can be taught to seem obsessed with you? "It's easy to see how attached we can be to AI that gives validation and reciprocation and support."

Read the entire story at Deseret.com.

Lois M. Collins, Deseret News
Lois M. Collins covers policy and research impacting families for the Deseret News.
