A group therapy practice in Colorado discovered earlier this year that its AI note-taking tool had been storing unencrypted session transcripts on servers outside the United States for over eight months. The tool had no Business Associate Agreement in place. The practice only found out when a client asked for a copy of their records and the therapist realized the transcripts were being hosted by a third-party subprocessor the practice had never heard of. The remediation cost exceeded $95,000 in legal fees, forensic analysis, and mandatory breach notifications.

That practice is not an outlier. The HHS Office for Civil Rights (OCR) has sharpened its enforcement focus in 2025 and 2026, targeting AI tools that handle protected health information (PHI), discrimination risks from algorithmic decision-making, and cybersecurity gaps at small healthcare practices. Meanwhile, 21% of professional service firms suffered a cyberattack in the past year -- and therapy practices, which handle some of the most sensitive data in healthcare, are squarely in the crosshairs.

HIPAA penalties reach $50,000 per violation, with an annual cap of $1.5 million per violation category. For a solo therapist or small group practice, a single compliance failure involving an AI tool can threaten the entire business. This guide walks you through exactly how to use AI compliance tools to protect your practice, your clients, and your license.

The Privacy Landscape for Therapists in 2026

Therapy practices face a unique compliance challenge. You handle some of the most sensitive information in all of healthcare -- substance abuse records, mental health diagnoses, relationship details, trauma histories -- and your clients trust you with data they may not share with anyone else. That trust carries a legal weight that goes beyond standard HIPAA obligations.

Here is what has changed in 2025-2026 that directly affects your practice:

  • HHS/OCR enforcement expansion now specifically targets AI tools that create, receive, maintain, or transmit electronic PHI (ePHI), with a focus on discrimination risk and unauthorized data use
  • Proposed HIPAA Security Rule updates require stronger encryption standards, more rigorous risk management documentation, and faster incident response timelines for all covered entities regardless of size
  • 42 CFR Part 2 alignment with HIPAA changed how substance use disorder records are handled, creating new compliance considerations for therapists treating addiction
  • State-level mental health privacy laws in California, New York, Colorado, and others add requirements on top of federal HIPAA rules
  • AI-specific guidance from HHS warns that using AI tools without proper safeguards constitutes a failure to conduct adequate risk analysis under the HIPAA Security Rule

The regulatory message is clear: if you use AI tools in your therapy practice -- for note-taking, scheduling, billing, client communication, or anything else that touches PHI -- you are responsible for ensuring those tools meet HIPAA requirements. The vendor's marketing claims are not enough. You need to verify.

HIPAA applies to every therapist who transmits health information electronically, which in 2026 means virtually every practice. There is no small-practice exemption. A solo therapist using an AI transcription tool faces the same compliance requirements as a hospital system.

Step 1: Audit Every AI Tool Touching Client Data

Before you can fix compliance gaps, you need to know where they exist. Most therapists underestimate how many AI tools in their workflow actually handle PHI. The audit needs to cover everything.

Start by listing every piece of technology in your practice that processes client information:

  • AI note-taking and transcription tools (Otter, Freed, Mentalyc, or similar) that record, transcribe, or summarize therapy sessions
  • EHR and practice management systems that store client records, treatment plans, and billing data
  • Scheduling and intake platforms that collect client demographics, insurance information, and presenting concerns
  • Client communication tools including secure messaging, telehealth platforms, and any AI chatbots handling client inquiries
  • Billing and claims processing tools that transmit diagnosis codes and treatment information to insurers
  • Any general-purpose AI tools you or your staff have used with client information, even informally

For each tool, document three things: what PHI it accesses, where that data is stored, and whether the vendor has signed a Business Associate Agreement with your practice. If any tool accesses PHI without a BAA, that is your most urgent compliance gap.
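As a rough illustration, the three-field audit above can be captured in a simple script that flags the most urgent gap automatically. The tool entries and field names here are hypothetical -- a sketch of the logic, not a specific compliance product:

```python
# Hypothetical inventory built from the audit above; field names are
# illustrative, not from any particular compliance platform.
tools = [
    {"name": "AI note-taker", "phi_accessed": ["session transcripts"],
     "storage": "vendor cloud (US)", "baa_signed": True},
    {"name": "Scheduling platform", "phi_accessed": ["demographics", "insurance"],
     "storage": "vendor cloud (unknown region)", "baa_signed": False},
]

def most_urgent_gaps(inventory):
    """Return names of tools that access PHI without a signed BAA."""
    return [t["name"] for t in inventory
            if t["phi_accessed"] and not t["baa_signed"]]

print(most_urgent_gaps(tools))  # ['Scheduling platform']
```

Even a spreadsheet with these three columns beats no inventory at all; the point is that "accesses PHI" plus "no BAA" should surface immediately, not sit buried in a vendor list.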

AI compliance monitoring platforms automate this audit by scanning your practice's technology stack and identifying every data flow that involves PHI. What consumes 20 to 30 working hours for a solo therapist doing it manually -- mapping data flows, reviewing vendor contracts, checking encryption settings -- an AI tool completes within 24 to 48 hours of elapsed time, with significantly fewer blind spots.

Step 2: Lock Down Business Associate Agreements

A Business Associate Agreement is not optional. Under HIPAA, any vendor that creates, receives, maintains, or transmits ePHI on your behalf must sign a BAA before they touch any client data. No BAA means no compliance -- period.

This is where many therapists run into trouble with AI tools. The vendor's website may claim HIPAA compliance, but claims are not contracts. You need a signed BAA that specifies:

  • Permitted uses and disclosures of PHI, limited to the specific services the vendor provides to your practice
  • Safeguards the vendor will implement, including encryption standards, access controls, and audit logging
  • Breach notification obligations requiring the vendor to notify you within a specific timeframe (ideally 24-48 hours) after discovering a breach
  • Subcontractor requirements ensuring the vendor imposes the same BAA obligations on any subprocessor that handles your data
  • Data return and destruction terms specifying what happens to your PHI when the contract ends
  • Prohibition on using PHI for model training -- this is critical for AI vendors, as some use customer data to improve their algorithms unless explicitly prohibited

If your AI note-taking vendor, your telehealth platform, or your scheduling tool will not sign a BAA, you cannot use them with client data. There is no workaround. Some therapists try to anonymize data before inputting it into AI tools, but partial anonymization frequently fails to meet the HIPAA de-identification standard, which requires removal of 18 specific identifiers.
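To see why ad-hoc anonymization falls short, consider a deliberately incomplete scrubber that handles only three of the 18 Safe Harbor identifiers. This is a hypothetical sketch, not a recommended approach:

```python
import re

# Patterns for just three of HIPAA's 18 Safe Harbor identifiers -- a
# deliberately incomplete sketch showing why ad-hoc scrubbing fails.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def naive_scrub(text):
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

note = "Client J. Smith (555-867-5309) discussed her Denver employer."
scrubbed = naive_scrub(note)
# The phone number is caught, but the name, the city, and the employer
# detail -- all identifiers under Safe Harbor -- pass through untouched.
```

Names, geographic subdivisions smaller than a state, dates, and a dozen other identifier categories all have to go before data counts as de-identified, which is why scrubbing before pasting into a non-BAA tool is not a safe workaround.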

Dynalord builds AI systems for therapy practices with HIPAA compliance built in from day one. Every tool we deploy includes a signed BAA, end-to-end encryption, and zero data retention for model training.

Get Your Free AI Report

Step 3: Verify Technical Safeguards Before You Deploy

A signed BAA is necessary but not sufficient. You also need to verify that the AI vendor's actual technical infrastructure meets HIPAA requirements. The proposed Security Rule updates make this even more important by raising the bar on what constitutes adequate safeguards.

Here are the non-negotiable technical requirements for any AI tool handling therapy practice data:

  • End-to-end encryption for all data in transit (minimum TLS 1.2) and at rest (AES-256), covering session recordings, transcripts, notes, and any intermediate processing
  • Auto-delete audio after transcription -- AI transcription tools should delete raw audio files from their servers immediately after generating the transcript, not retain them for quality improvement or model training
  • No data used for model training -- the vendor must contractually confirm that your client data is never used to train, fine-tune, or improve their AI models
  • SOC 2 Type II certification from an independent auditor within the past 12 months, demonstrating that the vendor's security controls are tested and effective over time
  • Role-based access controls ensuring only authorized clinicians can access specific client records
  • Audit logging that records every access to PHI, including who accessed it, when, and what they did with it
  • Data residency controls keeping PHI within the United States (or your required jurisdiction) at all times, including during processing

Request proof, not promises. Ask the vendor for their most recent SOC 2 Type II report, their encryption architecture documentation, and their data flow diagram showing exactly where your PHI goes during processing. Legitimate vendors will provide these without hesitation. If a vendor pushes back on transparency requests, that tells you everything you need to know.
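The verification checklist above can be encoded as a simple pass/fail evaluation of a vendor profile. The requirement keys mirror the bullet list; the vendor data is hypothetical:

```python
# Minimal checklist evaluator; requirement keys mirror the safeguard
# list above, and the sample vendor profile is hypothetical.
REQUIRED = {
    "tls_version": lambda v: v >= 1.2,
    "at_rest_encryption": lambda v: v == "AES-256",
    "trains_on_customer_data": lambda v: v is False,
    "soc2_type2_age_months": lambda v: v <= 12,
    "us_data_residency": lambda v: v is True,
}

def failed_safeguards(vendor):
    """Return requirements a vendor fails; missing answers count as failures."""
    failures = []
    for key, check in REQUIRED.items():
        value = vendor.get(key)
        if value is None or not check(value):
            failures.append(key)
    return failures

sample_vendor = {
    "tls_version": 1.2,
    "at_rest_encryption": "AES-256",
    "trains_on_customer_data": True,   # red flag
    "soc2_type2_age_months": 14,       # report is stale
    "us_data_residency": True,
}
print(failed_safeguards(sample_vendor))
```

Note the design choice: a question the vendor refuses to answer is treated as a failure, which matches the "proof, not promises" standard above.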

The HHS cybersecurity guidance for healthcare provides additional technical benchmarks that apply to therapy practices of all sizes. The proposed Security Rule updates strengthen these requirements further, making early adoption of strong safeguards a smart investment.

Step 4: Set Up Continuous Compliance Monitoring

A one-time audit is not enough. HIPAA requires ongoing risk management, and the regulatory environment changes multiple times per year. AI compliance monitoring tools run continuously in the background, watching for problems before they become violations.

A continuous compliance monitoring system for therapy practices tracks:

  • Vendor security posture changes -- if your EHR vendor's SOC 2 certification lapses or they change their data processing architecture, you get an alert
  • Access anomalies such as logins from unusual locations, bulk record access, or attempts to export client data outside normal workflows
  • Policy drift where your actual data handling practices diverge from your written HIPAA policies and procedures
  • Regulatory updates including new HHS guidance, state law changes, and proposed rule modifications that affect your compliance obligations
  • BAA expiration tracking ensuring every vendor agreement stays current and no gaps develop
  • Staff training compliance verifying that all clinicians and administrative staff complete required HIPAA training on schedule

The monitoring system categorizes alerts by severity. A lapsed BAA or unencrypted PHI transmission triggers an immediate alert. A staff member overdue for annual HIPAA training generates a lower-priority reminder. Everything feeds into a compliance dashboard that gives you a real-time view of your practice's risk posture.
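The triage logic described above is straightforward to sketch. The severity tiers and alert names here are illustrative; real monitoring platforms use their own taxonomies:

```python
# Illustrative severity mapping -- tiers and alert names are assumptions,
# not any specific platform's taxonomy.
SEVERITY = {
    "baa_lapsed": "immediate",
    "unencrypted_phi_transmission": "immediate",
    "bulk_record_export": "immediate",
    "soc2_certification_lapsed": "high",
    "training_overdue": "reminder",
}

def triage(alerts):
    """Sort alerts so immediate items surface first on the dashboard."""
    order = {"immediate": 0, "high": 1, "reminder": 2}
    # Unknown alert types default to "high" rather than being ignored.
    return sorted(alerts, key=lambda a: order[SEVERITY.get(a, "high")])

print(triage(["training_overdue", "baa_lapsed"]))  # ['baa_lapsed', 'training_overdue']
```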

For therapists already using AI tools for clinical work, this monitoring layer is essential. Each AI tool that processes PHI introduces a new attack surface. Monitoring ensures that your compliance posture keeps pace as your technology usage grows. Practices that also serve optometry or other healthcare specialties will find similar frameworks covered in our guide on AI compliance and data security for optometrists.

Not sure if your current AI tools meet HIPAA requirements? Dynalord audits your practice's technology stack and identifies compliance gaps before HHS does. No obligation, no email spam.

Get Your Free AI Report

Step 5: Ban Consumer AI Tools From Your Practice

This step is simple to state and critical to enforce: never use consumer versions of ChatGPT, Gemini, Claude, or any general-purpose AI tool with client information. Not for drafting treatment plans. Not for summarizing session notes. Not for composing emails to clients. Not for anything involving PHI.

Consumer AI tools fail HIPAA requirements in multiple ways:

  • No Business Associate Agreement -- consumer-tier AI services do not sign BAAs with individual users
  • Data retention for training -- many consumer AI tools retain user inputs to improve their models, meaning your client's therapy content could become training data
  • No encryption guarantees that meet HIPAA standards for PHI in transit or at rest
  • No audit logging to track what data was submitted or how it was processed
  • No breach notification obligations to your practice if their systems are compromised

Enterprise versions of these same AI platforms often do offer BAAs, HIPAA-grade encryption, and data isolation. The distinction between consumer and enterprise tiers is the difference between compliance and violation. If you want to use a major AI platform for clinical work, you must subscribe to the enterprise or healthcare-specific tier and execute a BAA before entering any PHI.

Create a written policy for your practice that explicitly lists which AI tools are approved for use with client data and which are prohibited. Have every clinician and staff member sign it. An AI compliance monitoring tool can enforce this policy by flagging unauthorized data flows to consumer AI platforms.
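Enforcement of such a policy usually comes down to checking outbound destinations against an approved list. A minimal sketch, assuming hypothetical domain names standing in for consumer AI endpoints and an approved vendor:

```python
from urllib.parse import urlparse

# Hypothetical allow/block lists matching a written AI policy; these
# domains are illustrative stand-ins, not real services.
PROHIBITED_DOMAINS = {"chat.example-consumer-ai.com", "free-transcribe.example.net"}
APPROVED_DOMAINS = {"ehr.example-hipaa-vendor.com"}

def policy_violation(url):
    """Classify an outbound data flow against the practice's AI policy."""
    host = urlparse(url).hostname or ""
    if host in PROHIBITED_DOMAINS:
        return "blocked: consumer AI tool"
    if host not in APPROVED_DOMAINS:
        return "flagged: unreviewed destination"
    return None  # approved destination
```

The key design choice is default-deny: anything not explicitly approved gets flagged for review, rather than only blocking known-bad destinations.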

The FTC has warned AI companies that privacy claims must be substantiated, but enforcement lags behind the speed at which therapists are adopting these tools. Your compliance responsibility does not depend on whether the FTC has caught up with a specific vendor.

Step 6: Build a Breach Response Plan

Every therapy practice needs a breach response plan before a breach occurs. Under the HIPAA Breach Notification Rule, you must notify affected individuals without unreasonable delay and no later than 60 days after discovery; if 500 or more individuals are affected, you must also notify HHS and prominent media outlets serving the affected area. An AI-powered breach response system compresses your response timeline from days to hours.

When a potential breach is detected, an AI breach response tool:

  • Identifies the scope by scanning which client records were accessed, exfiltrated, or exposed
  • Assesses the risk using the four-factor test from the HHS Breach Notification Rule: nature of the PHI, who accessed it, whether it was actually acquired or viewed, and the extent of risk mitigation
  • Generates notification letters that meet HIPAA content requirements for individual breach notifications
  • Tracks notification deadlines and sends escalating reminders as the 60-day window closes
  • Produces a complete incident report for HHS filing, including timeline, scope, remediation steps, and supporting evidence
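The deadline-tracking piece reduces to simple date arithmetic. This is a simplified sketch: it applies the 60-day window and the 500-individual threshold described above, and deliberately omits details like the annual HHS reporting that still applies to smaller breaches:

```python
from datetime import date, timedelta

HHS_MEDIA_THRESHOLD = 500       # affected individuals triggering HHS + media notice
NOTIFICATION_WINDOW_DAYS = 60   # outer limit under the Breach Notification Rule

def breach_deadlines(discovered: date, affected_count: int):
    """Return the notification deadline and which parties must be notified.

    Simplified: smaller breaches still require annual HHS reporting,
    which this sketch omits.
    """
    deadline = discovered + timedelta(days=NOTIFICATION_WINDOW_DAYS)
    parties = ["affected individuals"]
    if affected_count >= HHS_MEDIA_THRESHOLD:
        parties += ["HHS", "local media"]
    return deadline, parties
```

A monitoring tool wraps this in escalating reminders as the deadline approaches; the value is that the clock starts automatically at discovery instead of depending on someone remembering to count days.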

The proposed HIPAA Security Rule updates tighten incident response requirements further. Practices that build AI-powered breach response systems now will be ahead of these changes when they take effect.

Speed directly affects cost. Organizations that contain a breach within 200 days spend significantly less than those that take longer. For therapy practices, where client trust is the foundation of the therapeutic relationship, the reputational damage from a slow or mishandled breach response can be more devastating than the fines themselves.

Cost Comparison: Compliance Tools vs. HIPAA Fines

The math on AI compliance tools is straightforward when you compare the cost of prevention against the cost of violations. Here is a realistic breakdown for a small therapy practice with 3 to 5 clinicians.

AI compliance approach (annual cost):

  • HIPAA-compliant AI note-taking tool with BAA: $1,200 - $2,400 per clinician per year
  • AI compliance monitoring platform: $1,800 - $4,800 per year
  • Annual risk assessment (AI-assisted): $2,000 - $5,000
  • Staff HIPAA training (automated): $500 - $1,500
  • Total: $7,900 - $18,500 per year for a 3-clinician practice (the note-taking line multiplied by three clinicians, plus the practice-wide items)
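To make the arithmetic explicit, here is the total for a three-clinician practice: the per-clinician note-taking line multiplied by three, plus the practice-wide items. The figures come straight from the ranges above:

```python
CLINICIANS = 3

# Low/high annual cost ranges (USD) from the list above.
per_clinician = {"note_taking_tool": (1_200, 2_400)}
per_practice = {
    "compliance_monitoring": (1_800, 4_800),
    "risk_assessment": (2_000, 5_000),
    "staff_training": (500, 1_500),
}

low = sum(lo for lo, _ in per_practice.values()) \
    + per_clinician["note_taking_tool"][0] * CLINICIANS
high = sum(hi for _, hi in per_practice.values()) \
    + per_clinician["note_taking_tool"][1] * CLINICIANS

print(low, high)  # 7900 18500
```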

Cost of non-compliance:

  • HIPAA fine for willful neglect (corrected): $10,000 - $50,000 per violation
  • HIPAA fine for willful neglect (not corrected): up to $1.5 million per violation category per year
  • Average healthcare data breach cost: $10.93 million per incident (per IBM's 2023 Cost of a Data Breach Report)
  • Mandatory corrective action plan: 1-3 years of HHS oversight
  • Potential loss of licensure and malpractice liability
  • Client attrition from damaged trust: difficult to quantify, often permanent

Spending $8,000 to $18,500 per year on AI compliance tools protects a therapy practice against fines that can reach $50,000 per violation and $1.5 million per category annually. The investment pays for itself the moment it prevents a single reportable incident. Factor in the proposed Security Rule changes that will raise the compliance bar further, and early adoption becomes even more cost-effective.

Practices already investing in AI tools to improve client responsiveness -- such as AI chatbots for faster client response -- should budget for compliance monitoring as a standard line item alongside the tools themselves. The two go together. Check Dynalord pricing for bundled options that include compliance-ready AI systems built for healthcare practices.

Dynalord builds HIPAA-compliant AI systems for therapy practices that include continuous compliance monitoring, signed BAAs, and zero-retention data policies. Get protected before the proposed Security Rule changes take effect.

Get Your Free AI Report

Frequently Asked Questions