Ethics & Practice

AI in Therapy Documentation: The Ethical Questions Every Clinician Should Ask

AI note-writing tools are becoming standard in clinical practice. Before you adopt one, here are the questions you should be asking — and the answers that should satisfy you.

Dr. James Whitfield, LCSW · September 18, 2024 · 9 min read

The conversation about AI in therapy documentation has moved from "should we?" to "which one?" with remarkable speed. A growing share of therapists in private practice now use some form of AI-assisted note writing.

That's not a problem — but it does mean the ethical frameworks need to catch up.

Here are the questions every clinician should be asking before they adopt any AI documentation tool.

1. Who Has Access to My Session Data?

This is the foundational question. When audio from your sessions — or transcripts of them — passes through an AI system, who can see it?

What to look for:

  • Does the vendor have a signed BAA (Business Associate Agreement) in place? This is legally required under HIPAA for any vendor handling PHI.
  • Is audio processed ephemerally (transcribed and immediately discarded) or stored?
  • Do the vendor's employees have access to your session content for model training or quality review?

A vendor who cannot clearly answer these questions should not be handling PHI.

2. Is the AI Trained on Therapy Sessions?

Some AI tools improve over time by training on user data. If your session transcripts are being used to train models — even in anonymized form — that's a consent and privacy issue your clients never agreed to.

What to look for:

  • A clear "no training on customer data" policy in the vendor's terms of service
  • Data processing agreements that explicitly prohibit use of PHI for model training
  • Opt-out provisions if training uses any form of aggregated data

3. How Accurate Is the Transcription?

Documentation accuracy has direct clinical and legal implications. An AI that mishears "passive suicidal ideation" as something else, or attributes statements to the wrong person in a couples session, creates a record that doesn't reflect the actual session.

What to look for:

  • Accuracy claims backed by real-world testing, not just marketing language
  • Speaker diarization for couples/group sessions
  • Medical and mental health vocabulary support
  • Easy editing workflows so corrections don't require rewriting entire notes

4. Where Is My Data Stored?

Geographic location of data storage matters for compliance. US-based clinicians should confirm that PHI is stored on servers within the United States.

What to look for:

  • Data residency in the US (or your jurisdiction)
  • Encryption at rest using AES-256 or equivalent
  • Encryption in transit using TLS 1.2+
  • SOC 2 Type II certification (independent security audit)

5. What Happens When I Stop Using the Service?

Data portability and deletion rights are often overlooked until they matter. If you cancel your subscription, what happens to your session data?

What to look for:

  • Clear data deletion policy with a defined timeline
  • Export functionality so you can take your notes with you
  • Confirmation of deletion upon request

6. Am I Still the Author of the Record?

Ethically and legally, you are always the author of a clinical record — even if AI drafted it. But some AI tools generate notes that don't reflect your clinical judgment and require significant editing you may not always have time for.

The risk: A clinician who rubber-stamps AI-generated notes without meaningful review may be signing off on documentation that doesn't accurately represent the session. That's an ethical problem.

What good practice looks like:

  • Review every AI-generated note before signing
  • Edit anything that misrepresents the session
  • Never sign a note you wouldn't be comfortable defending in a licensing board review

7. What Does My Licensing Board Say?

Guidance varies by state and by discipline. Some boards have issued explicit guidance on AI-assisted documentation; others haven't addressed it yet.

Steps to take:

  • Check your licensing board's website for any guidance on AI in clinical documentation
  • Review your malpractice policy — some carriers have specific requirements
  • Consult your professional association (APA, NASW, AAMFT, ACA) for ethics guidance
  • Document your informed consent process with clients

8. Have My Clients Consented?

Clients have a right to know how their session information is used. If audio is being recorded and processed by AI, that should be disclosed — ideally in writing, ideally before the first session.

A simple addition to your informed consent form covers this: "Sessions may be recorded for transcription purposes. Audio is processed by an AI service operating under a HIPAA Business Associate Agreement and is not retained after transcription is complete."

The Bottom Line

AI documentation tools, used responsibly, reduce administrative burden, improve note quality, and give clinicians more time for actual clinical work. The ethical framework isn't complicated:

  • PHI must be protected (BAA required, encryption required)
  • Clients must be informed
  • Clinicians remain the author of record and must exercise genuine review
  • The AI serves the clinician's judgment — not the reverse

The clinicians most at risk aren't those who adopt AI tools. They're the ones who adopt them without asking these questions first.
