Financial advisers in New Zealand work at a unique intersection: technically demanding work, significant regulatory obligations, and deeply sensitive client data. AI can genuinely help with the first two. The third requires getting the data handling right.
This guide covers where AI adds real value for NZ financial advisers — and where the FMA, Privacy Act, and professional obligations mean you need to be careful.
Where AI Genuinely Helps Financial Advisers
Research and Market Intelligence
AI accelerates research dramatically. Summarising fund manager reports, synthesising market commentary, explaining complex financial instruments in plain language for client communications — these are tasks where AI saves hours without requiring sensitive client data.
Keeping up with regulatory changes is also faster. AI can summarise FMA guidance updates, explain FSLAA requirements, and help advisers stay current on changes that affect their practice.
Client Communications and Reporting
Writing Statements of Advice, portfolio review letters, and market update communications takes significant time. AI drafts these faster — and with appropriate inputs, produces clearer, more accessible language than many advisers manage under deadline pressure.
The key is giving AI the right context (portfolio performance, client circumstances, relevant market events) while handling actual client data carefully. More on this below.
Compliance Documentation
The compliance burden on NZ financial advisers post-FSLAA is real. AI helps with: drafting and reviewing disclosure documents, internal policy documentation, training records, and compliance frameworks. It won’t replace your compliance lawyer for complex questions, but it significantly reduces the drafting burden.
Client Education Materials
Explaining investment concepts, risk profiles, and product characteristics in plain English — for clients with varying financial literacy — is time-consuming work AI does well. Newsletter content, explainer documents, and FAQ materials are strong use cases.
Internal Processes and Administration
Meeting notes, action item summaries, onboarding checklists, and standard operating procedures all benefit from AI assistance. The administrative layer of a financial advice practice — often handled by the adviser themselves in smaller firms — is where AI saves hours every week.
The Data Privacy Challenge
Financial advisers hold some of the most sensitive personal information that exists: income, assets, liabilities, KiwiSaver balances, insurance coverage, wills and estate plans, family circumstances. This data is protected under the NZ Privacy Act 2020 and subject to FMA obligations around client confidentiality.
The risk with consumer AI tools (free ChatGPT, the Claude.ai web interface, and similar) is that entering identifiable client information into these systems may constitute a privacy breach — especially on free tiers without enterprise data processing agreements.
Practical rules for financial advisers:
- Never enter client names, IRD numbers, account numbers, or portfolio values into consumer AI tools. Even with good intentions, this creates compliance exposure.
- Anonymise before using AI. “A 58-year-old client with a $400K KiwiSaver balance and moderate risk profile” rather than identifying details.
- Use enterprise agreements or local AI for anything with real client data. Enterprise plans include data processing agreements; local AI (like OpenClaw) keeps data on your hardware entirely.
- Document your AI use policy. The FMA expects advisers to have policies covering their tools and processes. AI is a tool — it needs to be in your policy framework.
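The anonymisation rule above can be partly automated. Below is a minimal sketch in Python, assuming a workflow where text is scrubbed before it reaches any AI prompt; the regex patterns, field labels, and the `anonymise` helper are illustrative, not a complete PII solution — a real practice should review output manually and extend the patterns for its own data formats.

```python
import re

# Illustrative patterns only -- extend and verify for your own data formats.
PATTERNS = {
    "IRD": re.compile(r"\b\d{2,3}-\d{3}-\d{3}\b"),           # e.g. 123-456-789
    "ACCOUNT": re.compile(r"\b\d{2}-\d{4}-\d{7}-\d{2,3}\b"), # NZ bank account style
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def anonymise(text: str, client_names: list[str]) -> str:
    """Replace identifying details with neutral placeholders before any AI prompt."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    # Names are not pattern-matchable, so pass in the names you know appear.
    for name in client_names:
        text = re.sub(re.escape(name), "[CLIENT]", text, flags=re.IGNORECASE)
    return text

note = "Jane Doe (IRD 123-456-789) holds $400K in KiwiSaver; account 12-3456-7890123-00."
print(anonymise(note, ["Jane Doe"]))
```

The output keeps the analytically useful detail (balance, product) while stripping the identifiers — exactly the shape of prompt the bullet above recommends.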
Regulatory Context: FMA and FSLAA
The Financial Services Legislation Amendment Act 2019 (FSLAA) and the FMA’s conduct obligations don’t specifically address AI — but the principles apply clearly.
Key obligations relevant to AI use:
- Client-first obligation. AI outputs that influence advice must be reviewed for accuracy and suitability. You can’t delegate the “client best interests” test to a model.
- Fair dealing. Any AI-drafted client communication must be accurate and not misleading. Hallucinations in financial content carry real regulatory risk.
- Competence, knowledge and skill. Using AI tools effectively is increasingly part of professional competence — but so is knowing their limitations.
- Record keeping. If AI assistance materially shaped advice given, consider whether that should be documented.
The FMA has not issued specific AI guidance as at early 2026, but the existing conduct principles apply. When in doubt, treat AI outputs as you would any other source: verify, apply professional judgment, and document.
AI Tools for NZ Financial Advisers
General-purpose AI tools most commonly used:
- Claude (Anthropic) — strong at long-form writing, compliance analysis, and explaining complex concepts in plain English. Popular for Statement of Advice drafting.
- ChatGPT (OpenAI) — widely used, broad capability. Enterprise plan required for client data handling.
- Microsoft Copilot — integrates with Microsoft 365 tools many practices already use. Enterprise data protection built in for M365 subscribers.
For advisers handling highly sensitive client portfolios, local AI running on dedicated hardware removes the cloud data question entirely — your conversations and context never leave your office.
Practical Starting Points
For financial advisers beginning their AI journey:
- Start with no-data tasks. Market summaries, product explainers, general compliance research, and email templates require no client data and carry no risk. Get your workflow established here first.
- Draft your AI use policy. One page is enough to start. What tools are approved, what data can go into them, who reviews AI-assisted outputs. Update as you learn.
- Choose your tools based on your data handling needs. If you’ll use AI for anything touching client-specific data, ensure you have an appropriate enterprise agreement or use a local solution.
- Build the habit of verification. Check AI outputs against actual product information and regulatory guidance before client use. Every time.
- Invest in proper training. AI tools are only as useful as the judgement of the person using them. Structured training for financial services professionals beats learning by trial and error in a regulated environment.
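The one-page policy suggested above can start from a simple structured skeleton. The headings and values below are placeholders to adapt, not a compliance-reviewed template:

```yaml
# Draft AI use policy -- illustrative skeleton, adapt and have it reviewed
approved_tools:
  - name: "Claude (enterprise plan)"
    client_data_permitted: anonymised only
  - name: "Local AI (on-premise)"
    client_data_permitted: yes
prohibited:
  - entering client names, IRD numbers, account numbers into consumer tools
review:
  ai_assisted_client_outputs: reviewed by the responsible adviser before sending
  policy_review_cycle: quarterly
records:
  document_ai_assistance: when it materially shaped advice given
```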
Frequently Asked Questions
Can financial advisers use AI in New Zealand?
Yes — with appropriate safeguards. AI is well-suited to research, drafting, and administrative tasks. The key constraints are client data privacy (Privacy Act 2020) and the need to apply professional judgment to any AI output that influences advice.
Is AI-generated financial advice compliant with FMA requirements?
AI assists the adviser; it doesn’t replace them. Statements of Advice and personalised recommendations must reflect the adviser’s professional judgment, not unchecked AI output. AI can accelerate the drafting process, but the compliance obligation stays with the adviser.
Can I put client portfolio data into ChatGPT?
Not on consumer/free plans. Enterprise plans with data processing agreements are appropriate for client-specific data. Local AI running on dedicated hardware is another option that removes the cloud data exposure question entirely.
Does the FMA have guidance on AI use?
As at March 2026, the FMA has not issued specific AI guidance for financial advisers. Existing conduct obligations — client-first duty, fair dealing, competence requirements — apply to AI-assisted work as they do to any other process.
Ready to build genuine AI capability in your financial advice practice? Talk to us about training tailored to NZ financial services professionals.