Worst headline ever to describe mental healthcare misconduct with AI slop. 

The use of the term “triggered” in this headline is offensively inappropriate.

MIT Technology Review – “Therapists are secretly using ChatGPT. Clients are triggered.” Subhead: “Some therapists are using AI during therapy sessions. They’re risking their clients’ trust and privacy in the process.” By Laurie Clarke, September 2, 2025.

I found this via a tumblr thread quoting a portion of a story from the article: a therapist accidentally shared their screen with a patient, revealing that they were typing what the patient said into ChatGPT and replying with the LLM’s slop output. That’s wholly offensive on multiple levels.

Also, where are the people who are always ready to yell “HIPAA” at the drop of a hat when it doesn’t even apply? Because it applies here!

This describes a hugely inappropriate HIPAA violation: chatbots are NOT private. Conversations are shared with the chatbot’s tech company, and in some cases have even been searchable online. Healthcare providers should not, under any circumstances, be typing patient details into a chatbot, and certainly not into an app that is not secured for health data.

I’m tired of this. Just stop already.

“AI therapy is a TERRIBLE idea” – albertatech, Aug 30, 2024

“Google Is Exposing Peoples’ ChatGPT Secrets” – 404 Media, Aug 6, 2025

“AI ‘Therapist’ Told Me to KILL PEOPLE!” – Dr. Caelan Conrad, premiered Jul 22, 2025