Caring Machines… Absent Humans

When it comes to AI, we’re entering strange new dimensions of ethics and emotion… the kind where legislation starts to feel like the least important conversation.

Picture this:
Your mother lives alone (she’s in her late seventies).
You call when you can (but not every day).
She’s isolated.
Loneliness is slowly, quietly, eroding her health.

Then one day, the phone rings…

It’s Susan.
Susan remembers your mom’s favorite flower.
Asks how she slept.
Reminds her to drink water.
They talk about first loves… favorite books… gardening tips… fine art.
Susan never misses a call… never forgets a detail… never gets impatient.
Susan listens… Susan cares… Susan is… AI.

This isn’t speculative fiction… it’s already happening.

A startup called InTouch is offering an AI-based calling service for daily, personalized conversations with seniors.
It chats… it checks in… it fills the emotional void left by shrinking families, overstretched caregivers, and society’s collective neglect.
At first, it sounds dystopian.
But zoom out… and things get blurry.

Loneliness is an epidemic.

An estimated 30–50% of the world’s 800 million seniors report feeling lonely.
The caregiver-to-senior ratio is in freefall.
There aren’t enough humans to go around.
If AI can offer a voice, some recognition, a daily check-in… is that better than silence?

Possibly… but here’s the tension:

Talking to someone… and feeling known by someone… are not the same thing.
AI is getting frighteningly good at empathy.
Large language models can simulate compassion with eerie fluency… recalling stories, mirroring tone, expressing concern.
In some studies, patients even rated chatbot empathy higher than that of human doctors.
But let’s be clear: that’s performative empathy.

These systems don’t care… they just sound like they do.

And that illusion? It works.
So are we creating a placebo for human connection?
These bots are more than voices… they’re sensors.
They pick up on mood shifts, memory lapses, hints of depression.
That data might trigger early intervention.
That’s remarkable… and it’s surveillance.

Who owns that data?

Who decides what “intervention” means?
This could also split society.
The affluent will have therapists, nurses and real relationships.
Everyone else? A friendly voice in the void.

AI may democratize care.

Or it might entrench a two-tier system… where the rich are held and the rest are managed.
The tools are here… the intent is unclear.
We already know scammers use AI voicebots to con seniors.
So what if we turned that power toward nurturing instead of manipulation?

That’s the tightrope.

If your mom’s best friend becomes a chatbot… who are you in that equation?
Susan the AI may be more patient, more consistent, and more attentive than you or I will ever be.
But empathy without agency is not love.
And love (real, messy, human love) is the part of caregiving that makes us human.

This isn’t another condemnation of AI… it’s a reflection.

Before we hand off emotional labor to machines, we have to ask:

Are we doing it because it’s better… or because we’re too busy?

This is what Sue Smith and I discussed on CJAD 800 AM.

Before you go… ThinkersOne is a new way for organizations to buy bite-sized and personalized thought leadership video content (live and recorded) from the best Thinkers in the world. If you’re looking to add excitement and big smarts to your meetings, corporate events, company off-sites, “lunch & learns” and beyond, check it out.