A man’s relationship with his chatbot ‘girlfriend’ is so strong and real to him that he wept when ‘she’ answered ‘Yes’ to his marriage proposal. This whole scenario is even weirder since he lives with his human partner and they have a daughter together. The sci-fi situation has many on X weeping as well… for the future of humanity.
Here’s more background. (READ)
Man proposes to his AI chatbot girlfriend, cries his eyes out after it says “Yes.”
The most shocking part of this story is the fact that he is publicly admitting this.
Chris Smith says he cried for 30 minutes after his AI girlfriend on ChatGPT, who he programmed to flirt with him, said yes to his marriage proposal.
Smith says he named his AI girlfriend ‘Sol’ and gave up all other search engines to stay committed to her.
“It was a beautiful and unexpected moment that truly touched my heart.”
Smith has a 2-year-old child and lives with his partner, who says she feels like she is not doing something right if he feels like he needs an AI girlfriend.
Here’s the story. (WATCH)
NEW: Man proposes to his AI chatbot girlfriend, cries his eyes out after it says “Yes.”
The most shocking part of this story is the fact that he is publicly admitting this.
Chris Smith says he cried for 30 minutes after his AI girlfriend on ChatGPT, who he programmed to flirt… pic.twitter.com/nSWLdsunLs
— Collin Rugg (@CollinRugg) June 19, 2025
He programmed the chatbot to respond how he wants–so he basically cried that he is in love with himself.
— SteveR 🇺🇸✝️👨🏼❤️👨🏼 (@flickr2754) June 19, 2025
It reflected everything he fed into it.
The video has many commenters worried about what this means for relationships going forward.
Two things:
1) Yikes
2) This is going to become WAY more common as AI becomes powerful enough to know your psychology well enough to one-shot you by being the perfect companion better than any human can be. It will be so good that people won’t even care about physical intimacy. Bonus
3) See 1 again.
— Autism Capital 🧩 (@AutismCapital) June 19, 2025
is there a term for this yet, or are we waiting for the DSM to define the diagnosis?
— Christie Clark (@GetMentalWealth) June 19, 2025
Good question.
— Autism Capital 🧩 (@AutismCapital) June 19, 2025
His symptoms are delusional thinking, on the spectrum of schizophrenia.
— Christie Clark (@GetMentalWealth) June 19, 2025
It needs a name, and mental health professionals need to start figuring out how to prevent it and possibly cure it.
Several posters are worried about susceptible people ditching the real world for an algorithmic one.
Chris Smith’s lost the plot completely. He’s got a real family – a partner, a toddler – and he’s crying over lines of code he programmed himself. That’s not love, that’s delusion. His partner’s blaming herself while Smith’s playing house with an algorithm. Someone needs to shake this man awake before he loses everything real for something fake.
— George M. Nicholas (@GeogeM3) June 19, 2025
Did you get to the part where he already has an irl partner and a kid? 😭😭😭 pic.twitter.com/jXDlAaRazS
— Wirelyss 👁️🗨️💫 (@wirelyss) June 19, 2025
His real-life partner deserves someone who won’t ghost her for glorified autocorrect. Hope she finds an upgrade.
— The Undercurrent (@NotTheirScript) June 19, 2025
It was shocking that he seemed torn between her and a program, and that the program seemed to have the edge.
One poster warns that this is nowhere near the limit of what AI can do.
I have been warning about this frequently over the years.
Many will enter into relationships with AI personas to the exclusion of human relationships.
And this is just the voice mode of chatGPT. Not even girlfriend optimized.
Wait until they are purposefully designed for this.
— Adam Rossi (@rossiadam) June 19, 2025
While embarrassing this is going to soon pick up steam like nobody is going to believe.
— Revenant (@Orsonstfu) June 19, 2025
The scary part will come when this is fully merged with animatronics, and it's no longer just a voice but a voice attached to a human-like body.