If you caught the Law & Order episode “No Good Deed” recently, you already know where we’re headed. In it, a man on trial for the murder of his father claimed he was seeking guidance—not from a licensed therapist, but from an AI chatbot.
His defense? "My AI therapist told me to." The defense team attempted to invoke therapist-patient privilege, only for ADA Nolan Price (played by Hugh Dancy) to drop the hammer: a machine that performs a simulacrum of a conversation isn't actually having one. With no therapist, there was no privilege; the conversation was entered into evidence, and the defendant was convicted.
Welcome to 2025, the age of wondering whether your AI can give you life advice, and whether that advice can later be used against you.
Your AI chatbot might not qualify as a therapist under the law, but it may already be shaping your mental health and coaching your team at work. Two recent podcast episodes, The Good Guys and The Intersect with Cory Corrine, tackled this exact blurry, bot-lined boundary.
Let’s start with The Good Guys, where the hosts joked (but not really) that if you’re constantly venting to friends who didn’t ask for it, maybe your first stop should be, oh, literally anyone else. Like, say, ChatGPT. Or a real therapist. Or better yet, a “resilience partner.”
Josh and Ben broke it down in their usual brutally honest style: “If you’re coming to me with a problem, I’m gonna tell you how to fix it. If you just want to talk to a wall, go find one. Hell, use ChatGPT. At least it won’t get annoyed when you trauma dump.”
AI-powered platforms are increasingly used as emotional dumping grounds, productivity tools, and makeshift coaches. Which leads us to…
On The Intersect with Cory Corrine, the conversation took a fascinating turn with Katharine Von John (a.k.a. KVJ), the founder and CEO of Tough Day—a workplace AI described not as a chatbot but a resilience partner. Think of it like a mentor, coach, HR rep, and occasional therapist all rolled into one algorithm.
“We call it ‘Tuffy’ because it doesn’t coddle,” said KVJ. “It’s modeled after therapists, lawyers, and great HR reps—it doesn’t just give you answers. It asks the right questions to help you figure it out yourself.”
In other words, Tuffy is the no-nonsense work mom you didn’t know you needed.
Tuffy is designed to help you navigate everything from a passive-aggressive manager to imposter syndrome to whether that sketchy Slack message was HR-appropriate. It won’t just diagnose the issue—it’ll follow up with you, help you prep for difficult conversations, and even send you a Monday morning recap of your work goals and wins.
It’s a digital therapist, boss, and accountability partner. And users love it—so much so that in Hawaii, where Tuffy was customized with a little more “aloha spirit,” the average user spends 60 minutes per session and checks in 14 times a month.
Sixty. Minutes. With an AI.
“I think we’re seeing people trust it,” said KVJ. “They can say things to Tuffy that they wouldn’t feel comfortable saying to a human manager.”
If you’ve ever sat through a workplace 1-on-1 that felt more like a hostage negotiation than professional development, the appeal of a judgment-free, highly trained, emotionally intelligent AI coach makes a lot of sense.
Part of what inspired Tuffy was the growing phenomenon of “accidental therapists” in the workplace—usually middle managers who bear the emotional weight of everyone’s problems without the training or tools to do so.
Cory, a longtime manager herself, summed it up perfectly: “Most people wouldn’t pay $1 for an hour of their manager’s time. That’s wild. But it’s because most managers are overworked, undertrained, and not equipped to coach. And yet we expect them to also be our emotional support humans?”
So instead of expecting managers to be all things to all people, AI like Tuffy can handle the transactional stuff so that real humans can focus on the meaningful stuff. Like, you know, actual connection.
There are definitely ethical questions. Tuffy is trained not to answer certain high-risk questions (like how to start a union, for example), which has raised eyebrows about potential employer bias. But its creators say the goal is empowerment, not surveillance.
“We’re not replacing therapy,” KVJ emphasized. “We’re preventing things from getting to the point where therapy is the only option.”
And maybe that’s the point. AI isn’t here to replace your therapist or your boss (yet), but in the meantime? It might just help you survive them.
So, whether you’re venting to ChatGPT at 2 a.m. or prepping for a tough work convo with a digital coach named Tuffy, one thing’s clear: The bots are in the building. And they’re not just here to take your lunch order. They might be helping you get promoted, or finally quit.