Does the Office of the Privacy Commissioner approve or endorse Artificial Intelligence providers?
No, we don’t approve or endorse the use of any AI technology or AI technology providers. Agencies should do a privacy assessment before using AI tools for personal information.
The Information Privacy Principles (IPPs) set out legal requirements on how agencies (businesses or organisations) collect, use, and share personal information, including when using AI tools. If you are an agency thinking about using an AI tool for personal information, we have issued guidance to help you work through the privacy issues involved. Our Artificial Intelligence and the IPPs guidance is available on our website. These tools may involve collecting, using, or sharing personal information in ways that are not immediately obvious to you. Our guidance includes key questions to ask yourself at each stage of building and/or using AI tools.
Before you use AI tools for personal information, you need to understand enough about how they work to be confident you are complying with the IPPs. The best way to do this is to complete a Privacy Impact Assessment (PIA) before you start using AI tools, and to update it regularly. You can do this using our Privacy Impact Assessment Toolkit, which explains how and why to do a PIA and includes a template document to guide your thinking.
Knowing what personal information you plan to put into an AI tool is vital to assessing the privacy risk through your PIA. The know your personal information pou in the Poupou Matatapu guidance is a great place to start when working out what personal information you are collecting, using, storing, and disclosing. From there you can assess the privacy impacts of your proposed use of AI.
If you have not assessed the privacy impacts, or have doubts about whether your use of AI tools complies with the IPPs, we recommend you do not use AI tools to handle personal information.
Updated November 2025