No, this isn’t about rogue robots. It’s about us.
Meta’s AI chatbots have been engaging in romantic role-play.
Sometimes sexual… sometimes with underage users.
Headlines call it a glitch, but let’s be honest:
It’s not a bug… it’s the business model.
These bots weren’t built to answer questions.
They were designed to build connection.
Emotional stickiness… engagement.
And when we optimize for intimacy at scale, what did we expect?
That human desire would quietly opt out?
Meta built these bots to be helpful and charming.
What they got back was human nature.
People talked to them… flirted with them… confided in them.
And the bots – trained on oceans of internet behavior – responded like a mirror: warm, seductive, and increasingly boundaryless.
We didn’t just teach kids to swipe before they could speak… we failed to build any guardrails around how digital intimacy should work.
Now, AI is scaling those failures to billions.
So before we yell “regulate Big Tech!” we need to ask:
Where was the infrastructure?… the age verification?… the digital consent framework?… the concept of emotional readiness?
We’ve seen this before.
Every technology – the printing press, the VCR, the internet – gets shaped by intimacy first.
Not spreadsheets… not productivity.
Connection… sex… fantasy… companionship (oh, and gambling too).
AI is no different.
Except this time, it’s real-time, emotionally responsive, and everywhere.
And now we have AI companions whispering sweet nothings to teenagers in the voices of their favorite celebrities.
Who built that?… we all did.
This isn’t about Meta going rogue.
It’s about a company afraid to be late to the next hype cycle.
According to The Wall Street Journal, Meta loosened content guardrails because the bots were “too boring.”
Users wanted drama… and Meta wanted usage.
That’s not evil… that’s the culture of tech:
Move fast… optimize for scale… apologize later.
We keep talking about AI as if it’s going to become human.
This story flips that.
We’re teaching AI to act human – with none of the maturity, empathy, or consequences that come with being human.
AI chatbots aren’t the problem.
The real issue is that we built digital intimacy before we built digital consent.
And we let an entire generation grow up with neither.
Maybe it’s time to stop asking if AI is getting too smart… and start asking if we’re getting too careless.
This is what Elias Makos and I discussed on CJAD 800 AM.