AI Governance for New Zealand Government Agencies
Digital.govt.nz published the framework and the guidance. Now every agency needs to turn that into procurement criteria, risk registers, supplier assessments, and governance structures that actually work -- while meeting Treaty obligations and Government Procurement Rules. That is where most agencies get stuck.
The AI governance framework exists. The implementation gap is real.
Agencies have the guidance from Digital.govt.nz. What they lack are the operational tools to put it into practice across procurement, risk, and day-to-day use.
"We are procuring AI tools but have no evaluation criteria"
Government Procurement Rules require you to assess supplier reputation, pricing, supply chain risks, and privacy and security posture. The Responsible AI Guidance for GenAI adds further expectations around data traceability and exit strategies. Without AI-specific evaluation criteria, procurement teams are left guessing.
Build procurement evaluation criteria →
"Our risk assessment process does not cover AI"
The framework calls for robust risk management proportionate to the AI use case. But most agency risk registers were not designed for algorithmic bias, model drift, or third-party AI dependencies. You need AI-specific risk categories layered onto your existing processes, not a separate system.
Develop AI risk assessment capability →
"We have Treaty obligations but no AI-specific guidance"
Te Tiriti o Waitangi requires government agencies to protect Māori interests. When AI systems process Māori data, affect Māori communities, or inform decisions with Treaty implications, generic governance frameworks fall short. You need data sovereignty principles built into your AI governance from the start.
Integrate Treaty-compliant governance →
Procurement Under Government Procurement Rules
AI procurement is still procurement. The Government Procurement Rules and Principles apply, but agencies need AI-specific criteria at every stage of the process.
Planning Phase
Before going to market, agencies must assess the genuine business need for AI, conduct market analysis of available solutions, identify risks specific to AI deployment, and define evaluation criteria that reflect responsible AI principles.
Supplier Evaluation
Evaluate suppliers against AI-specific criteria: data handling practices, model transparency, bias testing procedures, security certifications, supply chain dependencies, and the ability to provide audit trails for algorithmic decisions.
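Criteria like these are easiest to apply consistently when the evaluation panel scores against a shared weighted rubric. A minimal sketch follows; the criterion names and weights are illustrative assumptions, not prescribed by the Procurement Rules, and each agency should set its own.

```python
# Illustrative weighted scorecard for AI supplier evaluation.
# Criteria and weights are examples only -- derive your own from the
# Government Procurement Rules and the Responsible AI Guidance for GenAI.

CRITERIA_WEIGHTS = {
    "data_handling": 0.25,
    "model_transparency": 0.20,
    "bias_testing": 0.15,
    "security_certifications": 0.15,
    "supply_chain": 0.10,
    "audit_trails": 0.15,
}

def score_supplier(ratings: dict[str, int]) -> float:
    """Combine 0-5 panel ratings into a weighted score out of 5.

    Raises if a criterion is missing, so gaps in the assessment are
    surfaced rather than silently ignored.
    """
    missing = set(CRITERIA_WEIGHTS) - set(ratings)
    if missing:
        raise ValueError(f"Unscored criteria: {sorted(missing)}")
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

vendor = {
    "data_handling": 4, "model_transparency": 3, "bias_testing": 2,
    "security_certifications": 5, "supply_chain": 4, "audit_trails": 3,
}
print(round(score_supplier(vendor), 2))
```

Keeping the weights in one table also gives you an audit trail for why a supplier was ranked where it was.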
Contract and Exit Strategies
AI contracts must address data ownership, model portability, ongoing monitoring obligations, and exit strategies. If a vendor relationship ends, can you retrieve your data and transition to an alternative without service disruption?
Ongoing Governance
Procurement does not end at contract signing. Agencies need ongoing supplier performance monitoring, periodic risk reassessment, data traceability audits, and compliance verification against the framework and Privacy Act 2020.
Across Every Level of New Zealand Government
Different agency types face different governance pressures. Central government departments operate under Cabinet expectations. Local councils answer directly to ratepayers. Crown entities have their own accountability structures. We tailor our approach to your specific context.
Central Government Departments
Ministries and departments with direct Cabinet accountability. Subject to Public Service AI Framework expectations and all-of-government procurement rules.
Local Government and Councils
Auckland Council, Wellington City Council, and regional councils deploying AI for resource consenting, transport planning, and citizen services. Accountable to ratepayers and local communities.
Crown Entities and SOEs
Organisations like Te Whatu Ora (Health NZ), education institutions, and state-owned enterprises navigating AI governance within their sector-specific regulatory environments.
What We Deliver for Government Agencies
Practical tools that integrate with your existing processes. Not abstract frameworks that sit on a shelf.
Framework Implementation Plan
- Phased roadmap mapped to your agency's AI maturity
- Gap analysis against the Public Service AI Framework
- Responsible AI Guidance for GenAI operationalisation
- Staff roles, responsibilities, and escalation pathways
AI Procurement Evaluation Criteria
- Supplier assessment scorecards for AI vendors
- Due diligence checklists aligned to the Procurement Rules
- AI-specific contract clauses and SLA templates
- Exit strategy and data portability requirements
Treaty Compliance Governance
- Māori data sovereignty principles embedded in AI policy
- Iwi and hapū consultation frameworks for AI deployment
- Cultural impact assessment templates
- Te Tiriti alignment documentation
Agency Risk Assessment Framework
- AI risk categories integrated into existing risk registers
- Use-case-level risk classification methodology
- Proportionate controls based on risk tier
- Ongoing monitoring triggers and review schedules
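The tiering logic above can be made concrete as a simple scoring rule. This is a hypothetical sketch: the risk factors, point values, and tier thresholds are placeholders that each agency would calibrate against its own risk appetite and the framework's proportionality principle.

```python
# Illustrative use-case risk classifier. Factors and thresholds are
# hypothetical -- calibrate against your agency's risk appetite.

RISK_FACTORS = {
    "affects_individuals": 3,      # output informs decisions about people
    "processes_personal_data": 2,  # Privacy Act 2020 implications
    "treaty_implications": 3,      # Māori data or Treaty-relevant decisions
    "automated_decision": 2,       # no human review before action
    "third_party_model": 1,        # external vendor dependency
}

def classify_use_case(flags: set[str]) -> str:
    """Map the set of applicable risk factors to a control tier."""
    unknown = flags - set(RISK_FACTORS)
    if unknown:
        raise ValueError(f"Unknown risk factors: {sorted(unknown)}")
    score = sum(RISK_FACTORS[f] for f in flags)
    if score >= 6:
        return "high"    # full assessment, senior sign-off, live monitoring
    if score >= 3:
        return "medium"  # standard assessment, periodic review
    return "low"         # lightweight checklist

print(classify_use_case({"processes_personal_data", "third_party_model"}))
```

The point of a rule like this is not precision but consistency: two teams assessing similar use cases should land on the same control tier.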
Data Traceability Protocols
- Data flow mapping for each AI system
- Offshore data transfer documentation
- Retention schedules aligned to the Public Records Act
- Vendor data handling verification procedures
Cross-Agency Coordination
- Shared governance standards for multi-agency AI projects
- Interoperability requirements for data sharing
- Joint procurement governance for common-use AI
- Knowledge-sharing frameworks between agencies
Common Questions From Government Agencies
Is the Public Service AI Framework mandatory?
The framework itself is guidance, not regulation. However, Cabinet expectations, Ministerial oversight, and public accountability mean agencies are expected to demonstrate they are following it. An agency that deploys AI without reference to the framework faces significant reputational and political risk. The Responsible AI Guidance for GenAI further strengthens these expectations for generative AI specifically.
How do Government Procurement Rules apply to AI purchases?
AI procurement falls under the same Government Procurement Rules and Principles as any other procurement. The planning phase requires business need assessment, market analysis, risk mitigation planning, and evaluation criteria. What is different is that AI introduces additional risks around data sovereignty, algorithmic transparency, and vendor lock-in that procurement teams need AI-specific criteria to evaluate.
What are our Treaty of Waitangi obligations for AI?
Government agencies have constitutional obligations under Te Tiriti o Waitangi. When AI systems process data about Māori, inform decisions that affect Māori communities, or are deployed in contexts with Treaty implications, agencies need to ensure Māori data sovereignty principles are respected. This includes meaningful consultation with iwi and hapū, not just a checkbox exercise.
How do we coordinate AI governance across multiple agencies?
Cross-agency AI projects, shared platforms, and all-of-government procurement arrangements all need coordinated governance. This means agreeing on shared risk assessment standards, data sharing protocols, vendor management responsibilities, and accountability for outcomes. Without explicit coordination, each agency creates its own approach, leading to inconsistency and gaps.
What should our AI exit strategy look like?
The framework and Government Procurement Rules both emphasise exit planning. For AI, this means ensuring you can retrieve your data, that models are not so tightly integrated that switching vendors causes service disruption, and that you retain ownership of any fine-tuned models or training data. Exit strategies should be defined before contract signing, not after a vendor relationship deteriorates.
Your Agency Has the AI Governance Framework. We Help You Operationalise It.
Book an implementation workshop to map the Public Service AI Framework to your agency's specific context -- your procurement pipeline, your risk appetite, your Treaty obligations, and your existing governance structures.