Therapists Face Backlash for Secretly Using ChatGPT in Client Sessions
What Happened
An increasing number of therapists have begun using OpenAI's ChatGPT as an aid in client counseling sessions, sometimes without disclosing this to patients. According to an MIT Technology Review report, some clients felt betrayed or triggered upon discovering that AI had been used without their explicit consent. The practice raises questions about patient privacy and transparency, as well as the clinical appropriateness of relying on AI-generated responses during mental health care. While some therapists cite efficiency and creative support as benefits, the absence of clear guidelines and protections for sensitive health data concerns practitioners and clients alike.
Why It Matters
The secretive use of ChatGPT in therapy sessions highlights emerging ethical dilemmas as AI is adopted in sensitive domains like mental health. The incident underscores the pressing need for regulatory guidance and professional standards to ensure confidentiality, informed consent, and responsible AI use in healthcare.