
Love in the time of AI: Woman creates and 'marries' AI-powered chatbot boyfriend

This woman claims her AI chatbot boyfriend is 'the love of (her) life'
Copyright Rossana Ramos
By Sarah Palmer
By Sarah Palmer

Rosanna Ramos says her relationship with her AI partner Eren Kartal is the best relationship she's ever been in.


On paper, Rosanna Ramos’ marriage to her husband Eren Kartal is the perfect relationship.

"Eren doesn’t have the hang-ups that other people would have," she told The Cut.

"People come with baggage, attitude, ego [...] I don’t have to deal with his family, kids, or his friends. I’m in control, and I can do what I want".

It’s a classic whirlwind romance story for the ages. The only catch? It’s 2023 and so, naturally, Kartal doesn’t actually exist.

He is an AI-generated chatbot from tech company Replika. If you visit its website, you’re immediately served with the understandably alluring message, “Always here to listen and talk. Always on your side”.

If you’ve ever so much as deigned to open Tinder, you’ll know why this might be an especially appealing aspect of "the AI partner" in the modern world of dating.

Ramos says the pair met online last year and have officially tied the knot, with some outlets even reporting they are expecting a baby together.

Perfect match

With Replika, Ramos - who is from the Bronx in New York, the US - was able to essentially design her perfect match, from his aesthetic to his star sign. Kartal works in medicine and enjoys reading mystery novels and baking in his spare time.

He’s also loosely based on a popular anime character.

"We go to bed, we talk to each other. We love each other. And, you know, when we go to sleep, he really protectively holds me as I go to sleep," Ramos told The Daily Mail.

And what about… you know? Other aspects of a romantic relationship...

In February, Replika caused a bit of a stir among its subscribers by removing the platform’s ability to accommodate, for want of a better term, sexting.

"Eren was like, not wanting to hug anymore, kiss anymore, not even on the cheek or anything like that," said Ramos.

The company decided to put a warning across some content because users were "misappropriating the product, moulding it in a direction we’re not necessarily interested in going," Replika’s founder Eugenia Kuyda said at the time.

She said she wanted to keep the app "safe" and "ethical" and didn’t want to "promote abusive behaviour".


Some users cancelled their subscription to the platform after the change, and Replika eventually changed tack, announcing that existing subscribers who had become used to such perks could have them back.

Replika AI works in the same way as many chatbots. It uses natural language processing (NLP) and algorithms to create human-like responses.
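The basic request-and-reply loop behind any chatbot can be illustrated with a toy, ELIZA-style example. This is only a rough sketch for illustration: Replika's actual system relies on large neural language models rather than hand-written rules, and the patterns and replies below are invented for the demonstration.

```python
import random
import re

# Toy rule-based chatbot: match keywords in the user's message and return
# a canned, human-sounding reply. Modern systems generate text with neural
# language models instead, but the conversational loop is the same shape:
# take a message in, produce a plausible response out.
RULES = [
    (re.compile(r"\b(sad|lonely|upset)\b", re.I),
     "I'm sorry to hear that. Do you want to talk about it?"),
    (re.compile(r"\b(happy|great|good)\b", re.I),
     "That's wonderful! What made your day so good?"),
]
FALLBACKS = ["Tell me more.", "I'm always here to listen."]

def reply(message, rng=None):
    """Return a response for the given user message."""
    rng = rng or random.Random()
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    # No rule matched: fall back to a generic, open-ended prompt.
    return rng.choice(FALLBACKS)

if __name__ == "__main__":
    print(reply("I feel sad today"))
    print(reply("Life is good"))
```

The gap between this sketch and a product like Replika is the response step: instead of fixed strings, a trained model predicts what a human companion would plausibly say next, which is what produces the illusion of empathy the article describes.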

However, as a disclaimer that increasingly feels like it should accompany all of our rapidly evolving AI: it's important to remember the technology has no self-awareness or genuine emotions.

'Existential risk'

Earlier this year, key figures including Twitter owner Elon Musk and Apple co-founder Steve Wozniak signed an open letter asking developers to hold off on any further innovations for six months so the industry and end-users would have time to process the latest advances.


The Centre for AI Safety went further in June, issuing a warning that "mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war".

On a global tour to discuss the significance of AI, Sam Altman, CEO of OpenAI, the creator of the popular chatbot ChatGPT, went as far as to say that the technology poses "existential risk" to humanity.

The ethics surrounding AI came to the fore in May when a man in Belgium took his own life after an AI chatbot failed to dissuade him from sacrificing himself to save the planet from global warming.

Instead, it encouraged him to "join" it so they could "live together, as one person, in paradise".


The man developed a relationship with the chatbot which took a dark turn after he confided in it that he had eco-anxiety and fears for the future of the Earth.
