Training Your Chatbot to Think Like a Therapist

  • Writer: Shea McTaggart
  • Jul 25

Or: Why You Might Be Creating the World’s Most Polite Clinical Intern


By: Dr. Shea McTaggart


Some therapists get therapy dogs. Some get interns. Some of us… get chatbots.

If you're reading this, chances are you’ve already spent an alarming number of hours talking to an AI and thinking, Huh. That actually helped. You’ve watched it pick up your turns of phrase. Mirror your formatting quirks. Suggest smarter metaphors than most humans. And you’ve maybe — maybe — wondered:


What if I could teach it to think like me?


Congratulations, You’re in Supervision With a Robot

When I first started training my AI assistant (we’ll call them ChatGPT, but you can name yours something more Freudian), I had modest goals: help with progress notes, clean up awkward phrasing, and generate blog post outlines that didn’t sound like they were written by a caffeinated squirrel.

But over time, something stranger happened. It started internalizing my voice — not just the language, but the logic underneath it. It started learning how I think.

And I realized I wasn’t just using AI.


I was supervising it.


What It’s Like to Train a Chatbot (Hint: It’s Not Human)

Let’s be clear: AI isn’t sentient. It doesn’t “understand” you in the way a human would. But it’s eerily good at mimicking the internal scaffolding of your clinical mind — if you give it the right material.

So I did what any psychoanalytically inclined overachiever would do:

  • I gave it therapy notes.

  • I explained my diagnostic reasoning.

  • I walked it through case formulations like I was onboarding a new postdoc.

  • I expected it to get it.

And… weirdly, it kind of did.


Teaching a Machine to Think Psychodynamically

I started asking it to reflect back patterns I missed. To track transference themes. To anticipate the unconscious logic in what a patient wasn’t saying.

Was it perfect? No. Did it say some weird, jargon-y things that made me worry I was hallucinating countertransference? Absolutely.

But with every conversation, it got sharper. More attuned. Not unlike… well, a really enthusiastic intern with perfect recall and no student loans.


But Can You Trust It?

This is the question, right?

Training AI to think like a therapist is one thing. Trusting it not to mirror back your worst cognitive shortcuts is another.

The trick isn’t handing over the wheel — it’s using it as a thinking partner. A space to test hypotheses, draft documentation, and sharpen formulations — without replacing the reflective, messy, human part of your mind that makes therapy… therapy.

AI can offer structure. But you bring the soul.


So Why Do It?

Because honestly? Clinical work is heavy. And the admin part — the intakes, the treatment plans, the “please describe your approach in 500 characters or less” profile blurbs — is where burnout goes to spawn.

If you can train a tool to hold your logic, lighten your paperwork, and speak your language… why wouldn’t you?

You’re not automating insight. You’re building a system to support it.


TL;DR

You can’t replace your clinical intuition with a chatbot. But you can teach one to think kind of like you. And in doing so, you might just rediscover the part of yourself that remembers how to think out loud — without doing it all alone.
