Privacy Commissioner's keynote speech, IAPP Australia New Zealand Summit 2025
The Privacy Commissioner, Michael Webster, spoke at the IAPP (International Association of Privacy Professionals) Australia New Zealand Summit on 3 December 2025. Read his full keynote speech below.
Introduction
As always, it’s a privilege to be here this morning to talk about the focus of my Office, now and in the future, in protecting individuals’ right to privacy of personal information – and in helping organisations to do privacy well.
I have had a number of opportunities recently to meet with fellow Privacy Commissioners and heads of data protection agencies, and it seems we are all facing similar challenges and opportunities … including existential ones, like the very nature of privacy regulation.
Are we meant to be modern regulators, smart regulators, risk-based regulators, outcome- or performance-based regulators, customer-centric regulators … the list goes on.
I am quite happy to be a reasonable regulator … and this morning I want to talk about what that has meant for the New Zealand Office of the Privacy Commissioner this year, and for our future work and direction.
As always, though, I’d like to start with a bit of an overview of the state of privacy in New Zealand – it provides you all with useful context.
OPC Privacy Survey 2025
And let’s start with some headline results from our 2025 privacy survey – it’s available on our website privacy.org.nz:
- 49% have become more concerned about privacy issues over the last few years.
- Looking at that another way, 66% agreed that protecting personal privacy is a major concern.
- 67% would consider changing service providers if they heard a provider had poor privacy and security practices.
- 67% are concerned about the privacy of children, including when they use social media.
- 62% are concerned about government agencies or businesses using AI to make decisions about them, using their personal information.
- 77% think the Privacy Commissioner should have the power to ask a court to issue a large fine for a serious privacy breach that an agency caused either intentionally or through negligence.
- 77% think the Privacy Commissioner should have the power to audit the privacy practices of a business or government agency.
- 65% are willing to see increased use of privacy-intrusive technology if it reduces theft.
- 64% are willing to see increased use of privacy-intrusive technology if it increases personal safety.
- 82% agreed they want more control and choice over the collection and use of their personal information.
As they did for me, I am sure these results prompt a number of questions in your minds, including:
- How damaging could a personal information privacy breach be to an organisation, in terms of lost trust and confidence and lost customers – and, given that, what are the tools, tips and techniques for avoiding a privacy breach, or at least responding well to one?
- How can we best balance public-good goals, such as public safety, with privacy? and
- How can agencies lift their games in terms of providing assurance to citizens and customers that their right to privacy is being taken seriously?
The challenge of lifting our games becomes all the more real if you look at the stats in our latest Annual Report for the 2024/25 year.
Privacy complaints are up 21% from 2023/24, which was also a record year.
And the number of serious privacy breaches notified by organisations rose 43% this year.
In addition to doing our best to respond to these individual complaints and agency breach notifications, over the last year we have focused on what we call our one-to-many activities, including:
- Developing and then issuing a Biometric Processing Privacy Code to provide clarity and a legal framework around the automated use of biometric technologies.
- Finding that the live facial recognition technology model trialled by one of our major supermarket chains, Foodstuffs North Island (FSNI), complied with the Privacy Act 2020.
- Establishing Te Ranga Tautiaki, our Māori Reference Panel. We are now fortunate to work with a group of experts who bring a te ao Māori, or Māori world view, perspective to our work.
- Preparing comprehensive guidance on sharing information to protect the wellbeing and safety of children and young people.
- And of course, Privacy Week 2025 was a highlight for us. More than 8,500 viewers watched 21 free webinars on topics like AI, children's privacy, and Māori data sovereignty.
Office of the Privacy Commissioner
And what about my Office? Where to next for New Zealand's privacy regulator?
As some of you will have heard me say before, my Office’s purpose is ensuring that privacy is a core focus for organisations, in order to protect the privacy of individuals, enable agencies to achieve their own objectives, and safeguard a free and democratic society.
This year, we set ourselves three key areas of strategic focus.
First, provide guidance and develop processes to support the implementation of legislative and regulatory privacy initiatives.
In particular, the Privacy Amendment Bill introduced a new notification requirement for agencies, the Customer and Product Data Bill established new rights for individuals to share their data in specified sectors, and the Biometric Processing Privacy Code clarified and strengthened protections around the use of biometric technologies.
Our work, particularly through preparing and promulgating guidance, provided agencies with the confidence to meet the new requirements.
Second, engage with agencies to build their privacy capability and empower New Zealanders to assert their privacy rights.
We worked to develop agencies’ capability to do privacy well by promulgating guidance (such as Poupou Matatapu – Doing Privacy Well) and by working directly with selected agencies on projects with a significant privacy impact (such as the facial recognition technology (FRT) inquiry we ran alongside the trial conducted by FSNI, one of New Zealand’s biggest supermarket chains).
Third, focus our activities on the technological and digital innovations being adopted by organisations and businesses.
In New Zealand the public and private sectors – to greater or lesser extents – are adopting innovative technologies to help them achieve their objectives.
Our Office as the privacy regulator has needed to understand and respond to these changes, such as the widespread adoption of artificial intelligence tools and the retail sector’s use of technology to respond to theft and safety issues.
We have also continued to advocate for a specific set of amendments to ensure that New Zealand’s Privacy Act is fit-for-purpose in the digital age.
Next year is the year we need to develop and sign off on our updated medium-term strategy – in the jargon of the New Zealand public management system, our next Statement of Intent.
For me, the drive will be to ensure we find and maintain a good balance between compliance and expectations-setting activities.
I’m not thinking of moving away from our current purpose statement, but we have an opportunity to refresh and refocus our strategic objectives.
While this conversation is still going on, my own view is that I want a New Zealand privacy regulator that:
- provides ideas, advice and guidance to government and agencies, to promote and protect individual privacy,
- holds agencies accountable for protecting individuals’ right to privacy of personal information,
- takes a te ao Māori perspective on privacy, incorporating tikanga – or Māori custom – where appropriate, and
- advocates for a society which gives people freedom and autonomy to speak, think and act without unnecessary surveillance and monitoring.
Now, that’s what I think, but if you – as a member of the privacy ecosystem – have a burning sense that there is something else my Office should be focusing on or working on, please get in touch and share your thoughts.
Surveillance
Talking of areas of current focus, it goes without saying that we are all going about our lives at a time of increased surveillance.
And when I refer to surveillance, I mean both business surveillance and state surveillance.
As noted earlier, two major areas of focus in New Zealand have been the FSNI supermarket trial of the use of facial recognition technology, or FRT, and our parallel Inquiry into the trial, and the Biometric Processing Privacy Code issued by my Office.
It’s probably useful to note at this point that, unlike other parts of the world, New Zealand’s privacy law does not depend on consent as the primary authority for collecting, using, and disclosing personal information.
Consent certainly has a role, but the main driver is the legitimate business purpose of the holder of the information.
Our overall finding was that the live FRT operating model deployed by FSNI during the trial complied with the Privacy Act.
There are several key features of FSNI’s operational model that enabled us to come to this conclusion, and I think it’s important that I outline some of them to you today (for the technically minded, a simple sketch of the alert-handling flow follows this list):
- A clear and limited purpose: FSNI focused the use of the technology on identifying people who had committed serious harmful behaviour in the stores in the recent past.
- The system was effective in addressing that purpose.
- Fit-for-purpose technology: FSNI chose a technology product that had been proven to work at a high level of quality “in the wild”.
- Immediate deletion of most images: Images that did not trigger a match against the store’s “watchlist” (the image database of people of interest) were deleted almost instantaneously.
- “Watchlists” were generally of reasonable quality and carefully controlled: Staff were not permitted to add images of children or young people under 18, elderly people, or people with known mental health conditions.
- Accuracy levels were acceptable, once adjusted in response to problems: FSNI learned from some misidentification incidents, and instructed staff not to consider intervening unless the match level was at least 92.5%.
- Alerts were checked by two trained staff.
- There was a reasonable degree of transparency that the FRT trial was operating: Stores had clear signage at the entrance alerting customers that the trial was operating, with more signs in store.
- There was no apparent bias or discrimination in how discretion was exercised.
- Processes for requests and complaints: People who considered that they had been misidentified or wrongly enrolled on a watchlist were able to make complaints, and have information corrected or removed if a mistake was found.
- Security processes were in place to protect information: Only authorised people had access to the information on the system, and to the security room in which the equipment was stored.
- And stores demonstrated an awareness of, and regard for, privacy.
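For those who want to see how those features fit together, here is a minimal, purely illustrative sketch of the alert-handling flow. It is not FSNI’s actual system: the function, parameter and outcome names are assumptions for illustration, and only the 92.5% intervention threshold and the two-person verification step come from the Inquiry findings above.

```python
# A purely illustrative sketch of the trial’s alert-handling flow, as described
# in the Inquiry findings. Not FSNI’s actual system: all names and structures
# are assumptions; only the 92.5% threshold and the dual human verification
# step reflect the findings.

MATCH_THRESHOLD = 0.925  # staff were instructed not to intervene below 92.5%


def handle_capture(watchlist_match_id: str | None,
                   confidence: float,
                   confirmations_by_trained_staff: int) -> str:
    """Return the outcome for a single captured image."""
    if watchlist_match_id is None:
        return "delete immediately"         # no watchlist match: near-instant deletion
    if confidence < MATCH_THRESHOLD:
        return "delete immediately"         # match too weak to act on
    if confirmations_by_trained_staff < 2:  # alerts were checked by two trained staff
        return "await verification"
    return "consider intervention"          # limited to the trial’s stated purpose


# Example: a 93% watchlist match confirmed by both trained staff members
print(handle_capture("watchlist-entry-42", 0.93, confirmations_by_trained_staff=2))
```

The point of the sketch is simply that the safeguards – immediate deletion, a high match threshold, and human verification – sit in the decision path itself, not alongside it.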
While the trial model complied with the Privacy Act overall, our Inquiry identified further improvements that would need to be addressed before FSNI considered using FRT permanently or expanding it into additional supermarkets.
A few months after that Inquiry report, after what has been a long and considered process, I issued the Biometric Processing Privacy Code 2025.
The Code sets out specific privacy rules for organisations using biometrics.
The aim of the new rules is to allow for beneficial uses of biometrics, while minimising the risks for people’s privacy and society as a whole.
The Code will help make sure agencies implementing biometric technologies are doing it safely, and in a way that is proportionate.
In addition to the usual requirements from the Privacy Act, the Code strengthens and clarifies the requirements on agencies to:
- assess the effectiveness and proportionality of using biometrics – is it fit for the circumstances?
- adopt safeguards to reduce privacy risk, and
- tell people a biometric system is in use, before or when their biometric information is collected.
The Code also limits some particularly intrusive uses of biometric technologies, such as using them to predict people’s emotions or to infer information like ethnicity or sex, or other information protected under the Human Rights Act.
My Office believes having biometric-specific guardrails will help agencies deploy these tools safely, using the right tool for the job and protecting people’s privacy rights as they do it.
Guidance has also been issued to support the Code.
The guidance is very detailed and explains how we see the Code working in practice.
It also sets out examples so agencies planning to use biometrics can better understand their obligations.
Our bottom line is that biometrics should only be used if they are necessary, effective and proportionate; the key thing to make sure of is that the benefits outweigh the privacy risks.
Issues around surveillance by public sector agencies have also been to the fore in New Zealand this year.
It’s no secret that we have been engaged in extensive dialogue with the New Zealand Police on the collection of personal information in various ways … and this is only intensifying with the recent discussions and advocacy around the use of body-worn cameras by police officers.
You might be interested to know what positions I and my Office have taken around the broader issues … in the words of the great Australian rock philosophers, the Angels, “I keep no secrets from you”.
In summary:
- The act of recording people in public places, for ongoing use and retention in databases, can have a chilling effect on people’s civil and political rights.
- While individuals can be observed in public places, they do not automatically waive all their privacy rights; the right to privacy includes the right to be left alone by state agencies unless there is a reasonable justification for the public surveillance.
- The principles of proportionality and necessity, which are fundamental to the social licence of our democratic institutions, are critically important.
- Police can photograph people only when there is either a specific statutory authorisation or full compliance with the information privacy principles.
- When turning their minds to their reasons for collecting personal information using photography, officers must be able to connect the collection to a policing function or purpose.
- Any risk of indiscriminate collection would be highly concerning; there must be a threshold that means collected information is of reasonable relevance to a policing function.
- I remain concerned about any ability for Police to use information for an unknown potential future use.
- The combined effect of the Privacy Act and the Bill of Rights Act ensures that there are effective safeguards to limit the indiscriminate collection and retention of information, and the inappropriate surveillance of individuals or particular groups.
- The applicable legislative framework must set limits for the retention, use and destruction of material captured for general Police intelligence gathering, and include appropriate checks and balances.
Now, before anyone says “that’s just the sort of stuff you’d expect a civil liberties privacy regulator to say”, I do want to make the point again that I see myself as a reasonable regulator.
For example, this year we grappled with another issue: how do we balance important uses of personal information with strong protections for people’s privacy?
In August 2025, following some work by my Office, emergency services got access to device location information (DLI), a new way to find and help people when they cannot call our emergency number, 111 (for example, if they’re injured or lost).
That’s a great outcome, and it’s one enabled by the Privacy Act.
In New Zealand, up until this change, when someone in need called our emergency number, 111, their network provider could often send information about their location to emergency services (ambulance services, Fire and Emergency, and New Zealand Police).
This sharing of information is enabled by the Telecommunications Information Privacy Code, a legal instrument issued by the Privacy Commissioner under the Privacy Act.
Schedule 4 of that Code sets out the rules that enable emergency services to get this location information quickly, as well as privacy safeguards that keep it safe.
The Privacy Commissioner added Schedule 4 to the Code in 2017, following public consultation on the options, risks, and benefits of this sharing.
Now, the Code also enables the new DLI service.
Sometimes there are emergencies where people need urgent help but cannot call 111, such as search and rescue situations.
Use of DLI can help in these situations, but it also involves an intrusion on privacy, particularly when the person is not calling 111 themselves.
That means it’s important there are strong safeguards around when the information is collected and how it is used.
The rules in Schedule 4 of the Code set out strict safeguards for the use of DLI by emergency responders.
These safeguards, illustrated in a simple sketch after this list, include:
- Sharing is limited to specific agencies: Police, Fire and Emergency, ambulance services, and organisations involved in search and rescue operations.
- The threshold to use DLI is high. An emergency service provider can only request DLI if they believe it will enable them to prevent or lessen a serious threat to the life or health of the individual concerned.
- Before they use DLI, an emergency service provider needs to check it relates to the right person.
- A person whose DLI is collected must be notified unless this would create a safety risk. Notification is by a text message to the individual, which may be sent at the time or later on.
- All disclosures of DLI to emergency services must be logged, and the disclosure log must be reported to the Privacy Commissioner every three months.
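Again purely for illustration, here is a minimal sketch of how those Schedule 4 safeguards might gate a single DLI request. This is not the actual system operated by network providers or emergency services: the agency list mirrors the bullet points above, and every name and structure is an assumption.

```python
# A purely illustrative sketch of the Schedule 4 safeguards listed above.
# Not the actual system: names and structures are assumptions for illustration.

from datetime import datetime, timezone

PERMITTED_AGENCIES = {"Police", "Fire and Emergency", "ambulance service",
                      "search and rescue"}

# Every disclosure is logged; the log is reported to the Privacy Commissioner
# every three months.
disclosure_log: list[dict] = []


def request_dli(agency: str,
                believes_serious_threat_to_life_or_health: bool,
                relates_to_right_person: bool,
                notification_would_create_safety_risk: bool) -> bool:
    """Apply each safeguard in turn before device location is disclosed."""
    if agency not in PERMITTED_AGENCIES:
        return False  # sharing is limited to specific agencies
    if not believes_serious_threat_to_life_or_health:
        return False  # the threshold to use DLI is high
    if not relates_to_right_person:
        return False  # the request must relate to the right person
    disclosure_log.append({"agency": agency,
                           "at": datetime.now(timezone.utc)})  # always logged
    if not notification_would_create_safety_risk:
        notify_by_text()  # sent at the time or later on
    return True


def notify_by_text() -> None:
    print("Text: your device location was shared with emergency services.")
```

The design point – and the reason the Code’s safeguards read the way they do – is that each safeguard is a condition on every individual disclosure, not a general policy statement.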
DLI is not about collecting new information on people’s location, or about tracking individual devices.
It’s about getting information that network providers already hold to emergency services quickly, with good safeguards, where this helps to prevent or lessen a serious threat to someone’s life or health.
At my Office, we often talk about good privacy practices being “how to, not don’t do”.
We think that the story of device location information is a useful example of this, and we’ll be keeping an eye on it to make sure that the goal of upholding New Zealanders’ privacy is met, as it rolls out in practice.
Principles-based regulation
Linked with this focus on surveillance, one of the narratives we are currently dealing with is that – to avoid privacy regulation getting ‘in the way’ (God forbid we let privacy get in the way!) – there need to be clear and unambiguous rules which, if you follow them from A to B to C, let you do whatever you want without being pinged by the privacy regulator.
I have a few thoughts on that.
First, you cannot legislate or regulate for every situation or scenario; regulatory flexibility that allows principles to be applied to a particular fact situation seems to make a lot of sense.
And you can’t give a regulatory hall pass to everyone; some will inevitably undertake an activity that is harmful to privacy and fails the test of careful stewardship of personal information – whether through poor planning and implementation, or deliberately.
Far better to go through a process which tests your ideas and plans against a set of carefully specified principles.
For example, in New Zealand, the Privacy Act, Biometrics Code and associated guidance provide a flexible and pragmatic model appropriate for regulating information sharing and FRT in the retail sector.
We’ll be working with retailers, industry bodies, and business associations as they look to use biometric systems or share information, responding to their experience and requests for guidance or feedback as they arise.
This education, awareness and capability building strategy is our standard approach to supporting agencies to comply with new regulatory requirements – as evidenced by our approach to the introduction of the 2020 Act changes.
I appreciate that there will always be a call for more certainty, for rules that apply exactly to a particular organisation’s circumstances, but I think that it would be unwise to take a completely prescriptive approach, particularly because of the need for different approaches across different contexts.
Prescription risks baking in rules that are unduly restrictive for one organisation, or unjustifiably permissive for others.
In both cases, prescriptive rules, particularly in legislation, are difficult to change, requiring political will and significant time and resourcing.
Getting these wrong may have other consequences – for example, for the social licence for new technology use in our countries more broadly.
Further, in a shifting technological landscape, it may be difficult to create rules for a particular kind of technology that are suitably future proofed.
On the other hand, a principles-based approach provides flexibility, allowing multiple ways to comply with the law, as long as agencies can justify their decisions.
Privacy Officers and the privacy ecosystem: telling your story
Before I wrap up, and while acknowledging the many and varied members of the privacy ecosystem here today – law firms, consultants, academics, advocates and others – I want to direct a few comments at privacy officers: those of you who are part of privacy teams, or perhaps the sole person, in government agencies, NGOs, big corporates, medium-sized companies, and other regulated agencies.
In New Zealand, privacy officers have, of course, a special statutory role and status in privacy legislation – but, whatever your status, you are the people who have “doing privacy stuff in my organisation” in your job descriptions.
I would like to think that, in your day-to-day work, you are forming alliances not just with your legal team colleagues, but also with those who – in their own way – care deeply about the value of doing privacy well, and the cost of doing it badly … the marketing people, the tech people, the public relations people.
I would also like to think that, in recent years, due to the growing importance given to privacy and the protection of personal information, your jobs and your mission have become a little easier in your agencies.
But, I suspect that, within your organisations – as I find with my role in seeking to bring about real and needed change – there is still what I call “the resistance that lurks behind the mask of willingness”.
All too often, privacy is seen on the cost or liability side of the ledger, and not the income or asset side.
Many of you will have heard me talk about privacy being just good business.
Agency senior managers and boards may feel differently about the value of privacy if they face a multi-million dollar financial penalty for not keeping personal data safe.
They may feel that any privacy-related activity designed to protect against unauthorised disclosure of personal information is worth it, if it means not having to set aside tens of millions of dollars for customer remediation following a data breach, as Latitude Financial did.
And business owners, even small ones, may agree that it’s worth investing in cyber security when they find out from you that the average self-reported cost of a cybercrime to a small business is over $50,000.
And no one wants to lose customers after all the effort that’s gone into wooing them to your organisation.
But Cisco’s global 2024 Consumer Privacy Survey found that more than 75% of consumers said they would not purchase from an organisation they did not trust with their data.
And what about the tech-savvy generation? Well, Cisco found that almost half of consumers aged 25–34 have switched companies or providers over their data policies or data-sharing practices – they’re not just thinking about it; they’re doing it.
These are all stories to tell … and there are so many more, including case notes and reviews on privacy regulator websites.
Be ready to answer the question “what’s the gain? … what’s the gain from this organisation doing privacy well?”
So, next time you’re in the lift with the chief executive, or the company owner, and they ask you, “Who are you?”
You could say “I’m Michael and I’m part of the privacy team” … that’s fine.
But you could also say “I’m Michael and my focus is building and keeping customer trust and confidence in our organisation, and ensuring we’re seen as a safe and trusted place to do business with.”
My challenge to you, my challenge to myself, is that we all need to be the best privacy story tellers that we can be.