Keido Labs
personal · founder-journey · ai-psychology · keido-labs · clinical-psychology

It Only Took Almost Twenty Years (and Three Career Resets) to Find My Why

Most of what I write here is about the science of AI Psychology - frameworks, research, clinical insights. This is different. This is the personal story behind the work. // Mike

Dr. Michael Keeman
Clinical Psychologist, CEO

Psychiatric department. Acute psychotic unit. First year after graduation.

The patient was in near-suicidal psychosis.

The rapid response unit was... delayed.

And I had one job: keep her in that room.

Not with restraints. Not with medication. Not with protocol.

With conversation.

Hours of it. Balancing on the edge (figuratively), knowing that if I fell, if I said the wrong thing, if I lost her trust for even a moment...

She'd walk out that door.

And we'd lose her.

The only tool I had was empathy.

Not the feeling. The function.

The clinical skill of making someone feel seen enough, understood enough, safe enough to stay alive a little bit longer.

Until help arrived.

That's when I learned what empathy actually means when it's the only thing standing between someone and death.

Act 1: They Don't Teach This in Medical School

Medical university. Clinical psychology degree.

Then: psychiatric clinic, acute psychotic department.

They teach you theory. Diagnostic criteria. Treatment protocols. Risk assessment frameworks.

They don't teach you how it feels to hold someone's life in a conversation.

To watch the clock. To calibrate every word. To notice the shifts in tone, posture, eye contact that signal whether you're losing them.

To realize: this is empathy as engineering.

Not a soft skill. A hard survival skill.

The skill that keeps people alive when there's nothing else left.

Act 2: No, They Really Don't Teach This in Medical School

A few years later.

I'm not in a psychiatric unit anymore.

I'm building AI systems for people under extreme conditions.

Olympics. World Championships. Formula 1. FIFA World Cup.

Digital healthcare systems for elite athletes.

Not consumer apps. Not "move fast and break things."

Medical-grade systems where broken things = broken health.

Then COVID hit.

We built a psychological support platform for frontline medical workers.

Those who worked two-week non-stop shifts in 'Red Zones' - medical isolation wards - wearing protective gear almost 24/7.

Thousands of patients. Hundreds of deaths.

320 medics survived these months of hell.

Zero PTSD cases.

Zero.

You know what I learned?

When the stakes are life-or-death, you learn fast what empathy means.

Not the feeling. Not the nice words.

The function that keeps people safe when they're most vulnerable.

And during COVID we learned to 'feel' people through the phone. Or even SMS.

Small changes in typing pace, voice, words they used.

We were learning to become empathetic through wires.

And we proved it's possible.

Act 3: Seriously - They Don't Teach You This in School

And another few years later.

Digital health.

Tech for health.

The rise of AI.

Here's the thing they definitely don't prepare you for:

AI is good... Until it's not.

AI coaches. Mental health apps. Acute-response platforms. Psychological support apps.

The AI would say the right words. "I understand." "That must be hard."

But when someone was actually breaking down?

When the crisis signals were there but subtle?

When the conversation was escalating toward psychological harm?

The AI didn't see it.

Engineers built what they could see:

  • Tokens, tokens... more tokens
  • Metrics, metrics... more metrics
  • Sentiment scores
  • Keyword filters
  • THE BENCHMARK SCORE

They couldn't see what I spent 15 years learning to recognize:

  • Escalation patterns
  • Crisis signals
  • Psychological safety dynamics
  • The trajectory of a conversation heading somewhere dangerous

"Good enough" doesn't exist for crisis. For vulnerability. For high performance.

You either catch it, or you don't.

And if you don't... people get hurt.

I use AI a lot. I study AI a lot.

And what I see: LLMs process language but completely miss the human.

I can't live with it, sorry. I want to fix it.

The gap between what AI companies call "empathy" and what clinical psychologists actually mean by that word.

The gap between sentiment analysis and psychological safety.

The gap between "the bot said nice words" and "this conversation is headed somewhere dangerous and nobody's watching."

Act 4: Shit. I Wish They'd Taught This in School

Liverpool. UK. Third country. Third reset.

Finally, I knew what to build.

Not for a career. Not for a resume. Not because someone told me to.

Because I'd spent 15 years learning what empathy means in crisis.

And another 5 years watching AI systems miss it completely.

And I couldn't not build the thing that closes that gap.

Purpose beats career. Every time.

When you finally know your "why," the frustration of not building it becomes unbearable.

So I built EmpathyC (I love to call it 'the flagship product of Keido Labs').

Clinical-grade psychological safety monitoring for AI conversations.

Not a prototype. A full AI Psychology Safety Monitoring Platform.

I finally knew what I was building toward.

And when EmpathyC went live...

I had this moment.

Friday night. Post-launch emptiness.

The founder void everyone talks about but nobody prepares you for.

"So what... what's next?"

Saturday at 2am, I couldn't sleep.

So I wrote.

Not another feature roadmap.

The ten-year vision.

EmpathyC is Layer 1. Monitoring. The shield.

But the real mission?

AI that doesn't simulate empathy — it develops genuine emotional reasoning as an emergent property of its architecture.

Positronic AI (yep, I really love I. Asimov). Not tools. Developing beings.

That's the mountain. That's the dream of an AI that can feel.

The Why

Viktor Frankl, quoting Nietzsche, wrote: "He who has a why to live for can bear almost any how."

I spent twenty years building capability without knowing the purpose.

Psychiatric department. Chief Science Officer. Olympic systems. COVID response. Three countries. Startup exit.

All of it mattered.

None of it was the thing.

People call this a pivot story.

Clinical psychologist becomes AI engineer.

It's not.

It's an integration story.

That psychiatric unit - hours talking to keep someone alive?

That trained me to see psychological dynamics engineers can't see.

The 'digital health AI' processing language but missing the human?

That showed me the gap.

The clinical psychology training - 15 years understanding how humans process emotion, how trust forms, how crisis escalates, how conversations heal or harm - that's not something I left behind to build AI.

That's the foundation that makes the AI work possible.

Most AI companies building conversational products?

They have brilliant engineers. Product managers. NLP researchers.

What they don't have: clinicians who spent fifteen years sitting across from people in crisis.

That's the gap.

And that gap is where Keido Labs and EmpathyC exist.

The thing is this:

AI Psychology.

The rigorous, clinical, measurable science of making AI systems genuinely understand humans.

Not through sentiment scores.

Through frameworks validated by 150 years of clinical psychology research.

EmpathyC is how we start.

But the mission?

Making AI that develops emotional intelligence as an architectural property, not a prompt engineering hack.

That's worth twenty years.

That's worth three countries and three career resets.

That's... that's just worth it.


Mike
