AI training is a growing market in New Zealand, and like any growing market, quality varies enormously. Some providers have deep, practical experience. Others are repackaging generic content they learned from a YouTube video six months ago. The difference matters — particularly when you’re investing real money and expecting real change in how your team works.

These are the questions worth asking before you sign anything. They’re also questions we’re happy to answer about our own practice.

1. What AI tools do you use personally, every day?

A good AI trainer uses AI constantly in their own work. They should be able to name specific tools, describe what they use them for, and explain what they’ve learned about their limits and strengths from daily use. Vague answers (“I use various AI tools”) are a red flag. Specific answers (“I use Claude Pro for all my writing and analysis, ChatGPT for code, and Perplexity for research — here’s why”) indicate genuine working knowledge.

2. What have you actually built or created using AI in the last month?

This separates practitioners from theorists. A trainer who uses AI seriously should be able to point to specific outputs: a campaign they ran, a system they built, a workflow they developed, content they produced at scale. Generic answers about “helping clients” without specific examples suggest the trainer’s experience is mostly observational.

3. Is your training content specific to my industry?

Generic AI training is less valuable than industry-specific training. A workshop for accountants should use accounting examples, cover the tools accountants actually use, and address the profession's specific privacy obligations. Ask to see examples of how the training would be customised for your sector. If the answer is "the fundamentals are the same for everyone," you're probably getting generic content with your logo on it.

4. How current is your content? When was it last updated?

AI moves fast. Training content that was accurate twelve months ago may be significantly outdated today. New models, new tools, new capabilities, new risks — the landscape changes constantly. Ask when the training content was last substantially updated and what the update process looks like. A trainer who says “we update continuously” is more credible than one who can’t answer the question.

5. What results have your previous clients seen?

Ask for specific outcomes, not testimonials. Not “the team found it very helpful” but “the team reduced time spent on X by Y%” or “within 60 days, Z clients had implemented AI into their core workflows.” Good trainers track outcomes. If a trainer can’t give you at least a few specific examples of client results, their training may not be translating into real change.

6. How do you handle the NZ-specific context — Privacy Act, employment law, sector regulations?

Most AI training content is produced in the US or UK and doesn’t account for NZ’s specific legal and regulatory environment. The Privacy Act 2020, the Employment Relations Act, the RMA, the Health Information Privacy Code — these create real constraints on how AI should be used in NZ workplaces. A trainer who addresses these directly is doing NZ-specific work. One who doesn’t is probably repurposing overseas content.

7. What happens after the training? What support is available?

A training day without follow-through produces minimal lasting change. Research on learning consistently shows that skills stick when they’re applied, reinforced, and supported — not when they’re introduced once in a workshop and then forgotten. Ask specifically: what access do participants have after the training? Are there follow-up sessions? Is there a community? Is the trainer reachable for questions?

8. How do you measure whether the training worked?

If a trainer doesn’t have an answer to this, they probably aren’t measuring it. Good training providers want to know whether their training delivered outcomes — not just whether participants rated the day highly on a satisfaction survey (which is often misleading). Ask how they define success and how they track it.

9. Can I speak to a previous client in a similar situation to mine?

Any good provider should be able to connect you with a reference — not a curated testimonial but an actual conversation with a past client. If a trainer hesitates or can’t produce a reference, that tells you something. If they connect you readily and the reference conversation is substantive, that’s a strong signal.

10. What won’t your training help with?

This is the honesty question. Every training approach has limits — things it’s not designed for, contexts where it won’t work well, problems it won’t solve. A trainer who can articulate their limits clearly is more credible than one who claims to solve everything. The best trainers are honest about fit: they’ll tell you if what you’re looking for isn’t what they do, and direct you elsewhere. That’s the kind of provider worth working with.

How We Answer These Questions

We’re happy to answer all of the above directly. Here’s our honest take:

  • Tools we use daily: Claude Pro (writing, analysis, client communication), ChatGPT Plus (research, code, image work), Perplexity (current events, competitor research), OpenClaw (our own dedicated AI assistant for business operations).
  • Recent builds: A 34-suburb SEO content system for an AI services site, a speaking leads research and outreach system, an AI-powered client assessment pipeline, and the training content you’re reading right now — all AI-assisted.
  • Industry specificity: We customise all training content for your sector. We’ve built specific content for legal, accounting, healthcare, education, and professional services contexts in NZ.
  • NZ context: Our training explicitly covers the Privacy Act 2020, relevant professional standards, and NZ employment law as it applies to AI use.
  • Follow-through: All training participants get access to our prompt library, templates, and 30-day follow-up support. We offer ongoing coaching for clients who want it.
  • What we won’t help with: Technical AI development (building models, ML engineering), enterprise-wide transformation programmes for organisations of 500+, or industries we don’t know well enough to train effectively (advanced manufacturing, specialised scientific research).

If that sounds like what you’re looking for, let’s talk. If it doesn’t fit your situation, we’ll tell you honestly and try to point you in the right direction.
