Geoffrey Hinton Built AI. Then His Ex Used It to Break His Heart

Geoffrey Hinton has spent his career building the brains behind artificial intelligence. So there’s a delicious irony in his recent AI breakup – his ex-girlfriend used ChatGPT to tell him what a terrible boyfriend he’d been.

“She got the chatbot to explain how awful my behaviour was and gave it to me,” the 76-year-old told the Financial Times last week. His reaction was typically dry: “I didn’t think I had been a rat, so it didn’t make me feel too bad. I met somebody I liked more, you know how it goes.”

It’s a funny story, but it’s also kind of depressing. We’ve reached the point where people are outsourcing their breakup speeches to machines. And if it’s happening to the guy who helped invent this stuff, what does that say about the rest of us?

The Story Behind Geoffrey Hinton’s AI Breakup

Hinton’s casual shrug at being AI-dumped makes sense when you think about it. Getting torn apart by an algorithm isn’t quite the same as having a real person lay into you. The machine doesn’t actually know you, doesn’t have feelings about you, can’t be genuinely disappointed in your behaviour. It’s just spitting back patterns it’s learned from millions of other relationship disasters on the internet.

But that’s exactly what makes this whole thing so weird. His ex didn’t want to do the hard work of figuring out her own words, her own feelings. She handed that job over to ChatGPT and then delivered the results like it was her own emotional truth.

We’re not just talking about AI helping with grocery lists anymore. This is artificial intelligence mediating one of the most fundamentally human experiences – the messy, painful end of love.

The Data Doesn’t Lie (But It Doesn’t Feel Either)

Earlier this year, researchers at OpenAI and MIT dug into how people actually use ChatGPT. What they found was troubling: heavy users are increasingly turning to the chatbot for emotional support, often in ways that seem to make their loneliness worse, not better.

The study revealed thousands of conversations dripping with vulnerability and dependence. People pouring their hearts out to a machine that can’t pour anything back except sophisticated pattern matching dressed up as empathy.

OpenAI noticed the problem and recently tried to fix it. Now when you ask ChatGPT whether you should dump your boyfriend, it’s supposed to help you “think it through” rather than just giving you an answer. But honestly, if you’re asking a chatbot about your love life in the first place, you might already be in trouble.

The Guy Who Saw It Coming

The timing of Geoffrey Hinton’s AI breakup revelation is particularly rich. Over the last few years, he has become AI’s biggest pessimist, warning anyone who will listen that the technology he helped create might pose an existential threat to humanity. He left Google specifically so he could speak freely about these risks.

Now he’s got a front-row seat to a different kind of AI problem – not killer robots, but the slow erosion of genuine human connection. When breaking up becomes something you can delegate to software, what happens to our capacity for emotional honesty?

Hinton’s ex-girlfriend didn’t want to be vulnerable. She didn’t want to struggle with finding the right words or risk being inarticulate in the heat of the moment. She wanted the clean, polished critique that only an AI trained on millions of relationship advice columns could provide.

The Authenticity Problem

When another person calls you out on your bad behaviour, their words sting because they’ve witnessed it, lived through it, and felt hurt by it. Their words carry the weight of genuine experience.

But when ChatGPT explains why you’re terrible, it’s really just telling you what the internet thinks terrible boyfriends are like. It’s not personal – it’s statistical. The AI has no skin in the game, no broken heart, no disappointed hopes.

Maybe that’s why Hinton could brush it off so easily. A machine that’s never met you roasts you differently from someone who used to love you.

Welcome to the Future of Feelings

This isn’t going to be the last story like this. We’re heading toward a world where AI doesn’t just help us write work emails – it crafts our apologies, our love letters, maybe even our wedding vows.

But here’s the thing: we’ll know the difference. Every time we let an algorithm speak for us in moments that matter, we’re trading away a piece of what makes us human. The struggle to find the right words, the vulnerability of expressing imperfect feelings – that’s not a bug in human communication, it’s a feature.

The Most Human Response

Hinton seems to understand this instinctively. His response to the AI breakup wasn’t to analyze the chatbot’s critique or argue with its logic. He just moved on, like you do when someone you care about stops caring about you enough to use their own words.

Maybe that’s the lesson here. In a world where AI can fake empathy better than some humans can express it, the most radical act might just be insisting on doing the hard emotional work ourselves – even when it hurts, even when we’re bad at it, even when the algorithm could probably do it better.

Especially then.

Source: Business Insider


Ex Nihilo magazine is for entrepreneurs and startups, connecting them with investors and fueling the global entrepreneur movement

About Author

Malvin Simpson

Malvin Christopher Simpson is a Content Specialist at Tokyo Design Studio Australia and contributor to Ex Nihilo Magazine.
