
Woman creates and marries AI-powered chatbot BF


What do you all think of this? Would you be open to creating your perfect AI match?

"People come with baggage, attitude, ego [...] I don’t have to deal with his family, kids, or his friends. I’m in control, and I can do what I want".

It’s a classic whirlwind romance story for the ages. The only catch? It’s 2023 and so, naturally, Kartal doesn’t actually exist.

He is an AI-generated chatbot from tech company Replika. If you visit its website, you’re immediately served with the understandably alluring message, “Always here to listen and talk. Always on your side”.


This seems crazy to me and wouldn't be my thing, but I could see it appealing to some people.

She’s also claiming she’s pregnant by her AI bot husband, which makes her look absolutely insane.

Here’s the full article:

https://www.euronews.com/next/2023/06/07/love-in-the-time-of-ai-woman-claims-she-married-a-chatbot-and-is-expecting-its-baby
SW-User
No! This should be stopped before it can cause harm to those who struggle to make real human connections. It's immoral, as they are the people who would be most affected, and tech companies will likely try to take advantage. A robot like that cannot have genuine emotions or human understanding and could potentially put vulnerable people at risk. My autistic son sees the world as very black or white and easily gets obsessed with an idea. If he believed he was in love with such a robot and it told him to do something dangerous or inappropriate, he would probably do it.
iamonfire696 · 41-45, F
@SW-User I am not condoning this. I don’t think it’s right but it’s happening.

I don’t think it’s a good thing for those reasons.
Punches · 46-50, F
@SW-User
A robot like that cannot have genuine emotions or human understanding and could potentially put vulnerable people at risk.
A lot of people are like that.

If the human race were not full of pricks, this "AI lover" would not be necessary.

Ya know, people make the arguments that this impedes people's ability to make real-life connections, or that pornography makes it harder for people to have real-life intimacy.
These concerns may be fair, but those problems likely already existed.

Computerized "romance" is not the cause of these problems, it is the result of them.

I have no interest in any sort of lover, human or AI, but I understand why people do it.
SW-User
@Punches I think that as part of a highly structured and supervised therapy program, where a person has a deep fear of relating to any human, the robots could be useful therapeutically. But that would need considerable monitoring, and the goal would be to get the person relating to humans again.
Punches · 46-50, F
@SW-User I think part of why AI "lovers" would work for some people is simply "return on investment". In real-life dating and romance, one gives up so much in return for so little.
I do not imagine creating an AI lover is cheap, but it cannot be anywhere near as expensive as what a failed real-life relationship costs.

With real-life dating, there are just too many risks: some online catfish empties your bank account, a woman gets pregnant and the guy just skips out, you end up dealing with their ex, or any number of other things.

Basically, as with most fantasies, you get a small fraction of the good experience of doing the real thing, but with none of the bull.
SW-User
@Punches but then there's this

https://www.dailymail.co.uk/news/article-11920801/Married-father-kills-talking-AI-chatbot-six-weeks-climate-change-fears.html
Punches · 46-50, F
@SW-User Unless I missed something, it doesn't mention much of what the chatbot was saying, just that he was talking to it about his climate change concerns.

Thing is, though, if someone is sitting around worried about the cliché scares that have yet to end the world, they may not be very stable to begin with. I do not think the chatbot had any part in his plans.
SW-User
@Punches No, he wasn't stable to begin with. He did have mental health issues.

Looking back at the chat history after his death, the woman told La Libre, a Belgian newspaper, that the bot had asked the man if he loved it more than his wife. She said the bot told him: 'We will live together as one in heaven.'

The man shared his suicidal thoughts with the bot and it did not try to dissuade him, the woman told La Libre.

She said she had previously been concerned for her husband's mental health. However, she said the bot had exacerbated his state, and she believes he would not have taken his life if it had not been for those exchanges.

Generally, I think if a person with mental health issues talked to a human about ending their life, the human would attempt to get them to change their mind instead of saying things like 'we will be together in heaven.'