Mental health practice in New Zealand carries some of the most significant confidentiality obligations of any profession. Clients share their most vulnerable experiences in the expectation of complete discretion. At the same time, psychologists, counsellors, and psychotherapists face real administrative demands — session notes, ACC reporting, client resources, professional development, and practice management all consume time that could otherwise be spent with clients.

AI can meaningfully reduce that administrative load — but only when used with a clear understanding of what mental health data privacy requires.

Where AI Adds Real Value

1. Session Notes and Progress Documentation

Clinical notes — session summaries, treatment progress entries, risk assessments, and case conceptualisations — follow predictable structures for common presentations. AI can help build templates for specific modalities (CBT, ACT, DBT, Schema Therapy) and common presenting issues (anxiety, depression, trauma, relationship difficulties) that accelerate documentation without reducing quality.

The workflow: after a session, use your own words to describe what occurred in general clinical terms — themes, interventions used, client response, homework set — without identifying details. AI structures this into a professional note. You add the client-specific detail directly in your clinical records system, not in the AI tool.

2. ACC Mental Health Reporting

ACC-registered psychologists managing sensitive claim treatment (SCT) have significant documentation requirements — initial assessments, treatment plans, progress reports, and closure summaries. AI can help structure these documents faster from your clinical notes, particularly for the narrative sections that require clear, evidence-based writing.

As with all health-sector ACC documentation: de-identify when drafting with AI, verify clinical accuracy, and ensure the registered clinician takes full responsibility for every submitted report.

3. Client Psychoeducation Resources

Psychoeducation handouts, between-session worksheets, thought records, behavioural activation schedules, sleep hygiene guides, grounding technique cards, and self-compassion exercises can all be drafted with AI. Creating these materials from scratch is time-consuming; AI can produce a solid first draft that you review and refine for clinical appropriateness and your therapeutic style.

Better client resources improve between-session engagement and therapeutic outcomes — and AI makes it practical to create more of them.

4. Referral Letters and Multidisciplinary Communications

Referrals to psychiatrists, GPs, specialist services, or other therapists need to balance comprehensive clinical information with client confidentiality. AI can help structure these letters quickly from de-identified clinical notes. You finalise with the specific clinical detail and any disclosures you and the client have agreed to.

5. Practice Policies and Informed Consent Documentation

Informed consent forms, confidentiality agreements, cancellation policies, telehealth consent documents, and AI use policies for your practice can all be drafted efficiently with AI. These documents matter for both ethical practice and professional liability — a clear, well-drafted set (reviewed by you, and legally reviewed where appropriate) reduces risk.

6. CPD and Professional Development

Psychology and counselling have meaningful CPD requirements. AI can help you research evidence-based approaches, summarise clinical literature, prepare supervision case presentations, and structure reflective practice entries. It’s a research and writing accelerator — not a clinical supervisor.

7. Practice Marketing and Destigmatisation Content

Blog posts explaining common conditions, social media content normalising help-seeking, and website copy that communicates your therapeutic approach — AI can help you create this content consistently. Mental health professionals who share educational content build community trust and attract clients who are a good fit for their approach.

Mental Health Data: The Strictest Privacy Obligations in Practice

Mental health information is treated with special sensitivity under the NZ Privacy Act 2020 and the Health Information Privacy Code. It includes not just diagnoses, but anything disclosed in a therapeutic context — trauma history, relationship information, financial stress, substance use, suicidal ideation, and more.

This is not data to be casual about. The rules:

  • Never enter any client information into consumer AI tools — not names, not presenting issues, not session content, not diagnoses. Not even anonymised details that could identify a client in a small community.
  • The de-identification standard is higher for mental health than most health sectors. New Zealand communities are small. “A 35-year-old teacher in Dunedin dealing with a marriage breakdown” may be identifiable to someone who knows your client.
  • AI transcription of sessions requires explicit informed consent, a clear explanation of where the data goes, and likely a formal data processing agreement. Most general-purpose transcription tools are not appropriate for therapy sessions without enterprise-level privacy controls.
  • Your ethical codes — NZAP, NZAC, NZPsS, or the Psychologists Board standards — set confidentiality obligations that go beyond the Privacy Act. AI use must be consistent with those obligations.
  • A local AI setup (such as OpenClaw) that runs entirely on your own hardware is the only configuration in which session-related content could plausibly be used with AI assistance — and even then, your consent, security, and record-keeping obligations still apply.

The safest approach: keep AI entirely separate from client-identifiable content. Use it for templates, resources, research, marketing, and policy documents — all of which can be done without any client data involved.

What AI Cannot Do in Therapeutic Practice

AI cannot provide therapy. It cannot hold a therapeutic relationship, attune to a client’s emotional state, manage risk in a crisis, or provide the human presence that is itself a healing element of psychological work.

There are AI “therapy” products emerging internationally. These are not a substitute for registered professional care — and in New Zealand, providing psychological services requires registration with the Psychologists Board or membership of a recognised professional body. The therapeutic relationship and clinical judgment remain entirely human.

Getting Started

The lowest-risk, highest-value starting point: use AI to create or improve your client psychoeducation materials. Pick one handout you currently use — a breathing exercise, a thought record, a sleep hygiene guide — and ask AI to improve it: plainer language, better structure, more empathic tone. No client data involved, immediate clinical value.

From there, build note templates for your three most common presenting issues — using de-identified clinical language, with no client content — and test whether they speed up your documentation.

For a structured approach to AI in your practice — including a privacy-compliant AI use policy and staff or associate training — an AI Assessment provides a clear, ethical roadmap. We work with health and mental health practices across New Zealand.

Frequently Asked Questions

Can I use AI to help write my session notes?

You can use AI to build note templates and structure your documentation — but client-identifying content must never go into consumer AI tools. The practical approach: build templates with AI (no client data), then complete them with client-specific clinical content in your records system. Documentation quality improves while client information never leaves your own systems.

What do the Psychologists Board and NZAC say about AI?

Neither body has issued specific AI guidance as of early 2026, but both have clear standards on confidentiality and professional responsibility that apply to any tool or technology used in practice. Expect formal guidance to emerge as AI use becomes more widespread. In the meantime, apply the highest standard: if you wouldn’t share the information with a stranger, don’t share it with an AI tool.

Is AI-assisted therapy appropriate to offer clients?

Using AI tools as between-session resources for clients (apps, exercises, psychoeducation) is different from using AI as a therapeutic intervention. The former — carefully selected, clinician-recommended tools as supplements to therapy — is an emerging area with a developing evidence base. The latter — presenting AI interaction as therapy — is not appropriate for registered practitioners in New Zealand.

I work in a group practice — how do we set AI policy?

A group practice needs a written AI use policy that all practitioners follow — specifying which tools are approved, what client data may never be used with AI, how AI-assisted content must be reviewed, and what the process is if a practitioner has concerns. This protects both clients and practitioners. We can help you draft this as part of an AI Assessment.