
The final step of rule 1 is assessing the proportionality of the collection.


You must not collect biometric information unless you believe, on reasonable grounds, that the biometric processing is proportionate to the likely impacts on individuals. To assess whether the biometric processing is proportionate, you need to assess:

  • The scope, extent and degree of privacy risk from your biometric processing. 
  • Whether the benefit of achieving the lawful purpose through the biometric processing outweighs the privacy risk. 
  • The cultural impacts and effects of biometric processing on Māori.

Privacy risk

A key part of the proportionality assessment is determining the degree of privacy risk presented by your use of biometrics. 

Under the Code, privacy risk is any reasonable likelihood that the privacy of individuals may be infringed by the biometric processing. A privacy infringement is any impact or effect of the biometric system that may limit, undermine or encroach on an individual’s privacy or deter individuals from exercising their rights.

The concept of infringement is broader than an interference with privacy (see section 69 of the Privacy Act) in order to take account of the subtler systemic or aggregate impacts of using biometric systems that erode people’s privacy, like the impacts of monitoring public spaces, as well as distinct harms to individuals. 

When considering privacy risk, think about both how likely it is that an event will occur and what the consequences would be if it did.

The Code lists examples of possible privacy infringements that can result from using a biometric system or biometric processing, which include:

  • You collect more biometric information or keep it for longer than is necessary.
  • The biometric information collected is not accurate.
  • There are security vulnerabilities affecting the information.
  • There is a lack of transparency about how you are collecting biometric information.
  • Individuals are misidentified or misclassified because of the biometric processing, including where the misidentification or misclassification is due to differences in demographics such as race, age, gender or disability.
  • An individual may have adverse actions taken against them (e.g. a person is denied access to a service) or they may be deterred from exercising their rights (e.g. right to freedom of movement or freedom of expression) because of the use of biometric processing for the purposes of surveillance, monitoring or profiling. This risk could apply whether the surveillance, monitoring or profiling is done by a public or private organisation.
  • There is an unjustified expansion of the use or disclosure of biometric information after it is collected.
  • The ability of individuals to avoid monitoring is diminished in spaces where they may reasonably expect not to be monitored. Again, this risk is relevant regardless of whether the monitoring is done by a public or private organisation, and regardless of whether the monitoring occurs in a public or private space. “Monitoring” is more than just being seen or watched: it could include a person’s actions or movements being specifically followed or noted, or a decision being made because of what the person does.
  • Any other infringement of the privacy interests of individuals or any other infringement of the protections for biometric information in the Code.

Although the Code lists certain privacy risks that you must consider, the context of your biometric processing is key to understanding the privacy risk, and you may need to take into account risks that aren’t listed in the Code.

Note: requiring that agencies take into account the risk of privacy infringements in the proportionality assessment does not change the threshold for a successful privacy complaint under the Code, i.e. the threshold for finding that an agency has interfered with an individual’s privacy through its use of a biometric system.


How to assess privacy risk

All biometric processing has some risk, but some forms of biometric processing are higher risk than others.

To assess the privacy risk posed by your biometric processing, you could use the following framework based around what information you are collecting, whose information it is, why you are collecting it, and when, where and how you are collecting it.

What – the volume and nature of the information

  • What information are you collecting? How sensitive is that information? (Information that is inherently connected to a person, or that is particularly revealing or difficult to change or hide, will generally be more sensitive).

  • How much information are you collecting? How many people’s information?

  • Could the information be used to reveal other information about a person or profile them?

  • Can the information be easily linked with other information?

Who – whose information it is and who collects it

  • Whose information are you collecting? Are they vulnerable in some way? Are they more likely to be negatively impacted by any bias or discrimination (e.g. children, members of a minority group, or people experiencing distress or a reduced capacity to exercise privacy rights)?

  • Is there a power imbalance between you and the people whose information you are collecting? (e.g. employer/employee, landlord/tenant, government agency with enforcement powers, a provider of critical service with few alternatives vs. a provider of non-critical service with lots of alternatives).

  • Have individuals freely authorised the collection? Are there realistic alternative options if individuals want to opt out of biometric processing? (Authorisation with a genuine alternative is an important risk mitigation if it is practical for your circumstances).

  • Have you consulted with people whose information will be collected?

Why – why are you collecting the biometric information?

  • What is your purpose for collecting information? Is it broad and conceptual or clear and targeted? (This affects the risk of scope creep).

  • How complex is the use case? Does it involve multiple steps, dependencies, opaque systems, information flows or discretion? Or is it simple and straightforward? (Complexity increases the opportunities for misuse or failure and makes transparency and accountability harder to maintain).

  • What are the consequences, if any, for individuals? Is the system intended to support adverse actions against individuals? What would be the impact of errors or inaccuracy in the system on individuals? Is there a contingency or backstop process? (This affects the likelihood and severity of harm).

How – context and design of the system (including where and when it is operating)

For this dimension, think through the personal information lifecycle and your operational choices.

  • How is the information collected? Covertly, remotely or directly? Actively or passively? (This affects transparency and the ability of individuals to exercise choice and privacy rights).

  • What is the context – public space, private space, retail, entertainment? Might the system deter people from exercising their protected rights or reduce the ability of individuals to avoid monitoring where they may not expect to be monitored? (Public spaces, semi-public spaces or private spaces that are essential for individuals to access increase the risk because of the potential chilling effect and surveillance risks, whereas non-essential private spaces, especially where individuals have a range of alternatives, will generally lower it).

  • How is the information processed? Live or retrospective? High- or low-quality inputs? Centralised or local processing? (Relevant to surveillance, accuracy and data breach risks).

  • Is information stored? How long for? (Likelihood of security risks).

  • How is the information protected? Who has access to information? What are the security controls? (Likelihood of misuse, unauthorised access, information lost or stolen).

  • Is the information routinely or occasionally shared? Or collected, processed or stored on your organisation’s behalf? What protections are in place? (Risk of data breaches, unauthorised sharing or unlawful secondary use).

  • Are there policies around who can access the data and what it can be used for? Audit logs? (Risk of misuse, security breaches, unauthorised access).

Some privacy risks in biometric systems are inherent and can’t be changed – for example, why the information is collected (collecting for public surveillance will pose more risk than 1:1 verification) or who the data is about (children or other vulnerable groups raise the inherent risk). Other risks can be managed or mitigated through design choices, including how much information you collect, how you collect, store and protect it, or how long you retain it.

Remember that privacy risks also arise in the collection and use of non-biometric information. Assessing the relative risk of biometric processing compared to processing of other personal information can form part of your assessment of whether there is a reasonably effective alternative with less privacy risk. See the section on alternatives for more detail.
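
If it helps to work through the what/who/why/how questions systematically, one option is to record the answers in a structured checklist. The sketch below is a hypothetical illustration in Python, not an OPC tool; the class, field names and qualitative ratings are our own assumptions, not terms defined in the Code:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyRiskAssessment:
    """Working notes for the what/who/why/how framework (illustrative only)."""

    # What – the volume and nature of the information
    information_collected: str = ""          # e.g. "facial images"
    sensitivity_notes: str = ""              # revealing? hard to change or hide?
    volume: str = ""                         # how much, and how many people
    linkable_to_other_data: bool = False

    # Who – whose information it is and who collects it
    vulnerable_groups_affected: list[str] = field(default_factory=list)
    power_imbalance: bool = False
    freely_authorised_with_alternative: bool = False
    affected_people_consulted: bool = False

    # Why – purpose and consequences
    purpose: str = ""                        # clear and targeted, or broad?
    adverse_actions_possible: bool = False
    contingency_process_exists: bool = False

    # How – context and design (including where and when)
    collection_method: str = ""              # covert/overt, active/passive
    setting: str = ""                        # public, semi-public, private
    retention_period: str = ""
    security_controls: list[str] = field(default_factory=list)

    overall_risk_rating: str = "unassessed"  # e.g. "low", "moderate", "high"
```

Recording the assessment in a structured way like this can help show the reasonable grounds for your belief, and makes it easier to revisit the assessment if the system or its context changes.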

Summary of lower and higher risk factors

This table provides a high-level summary of factors that OPC may consider to increase or lower the privacy risk. Each assessment still turns on its own facts, and you need to consider how your own context and specific processing impact the degree of risk.

| Factors that tend to lower risk | Factors that tend to increase risk |
| --- | --- |
| System’s scope is limited or targeted, i.e. 1:1 verification; use impacts a small number of individuals or a select group. | System’s scope is wide or indiscriminate, i.e. 1:N identification; use impacts many individuals or society more generally. |
| Likelihood of errors is lower: high-quality biometric probes/references; operation in a controlled environment. | Likelihood of errors is higher: low-quality biometric probes/references; operation ‘in the wild’. |
| Established uses of biometrics with a robust scientific basis and high accuracy. | Emerging or novel uses of biometrics with uncertain effectiveness and a limited research/scientific basis. |
| Use in places where individuals would reasonably expect to be asked to confirm their identity. | Use in public spaces, or private/semi-private spaces where people don’t expect to be monitored. |
| Little to no power imbalance between individuals and the agency (e.g. a provider of an optional commercial service with lots of competitors; the individual has a high level of consumer power). | Significant power imbalance between individuals and the agency (e.g. an agency with law enforcement powers, a provider of a critical service with few or no competitors, an employment relationship, or a meaningful/significant age difference). |
| Low impact on the individual if a privacy risk eventuates, e.g. the system may inconvenience individuals or produce delays if it’s inaccurate. | High impact on the individual if a privacy risk eventuates, e.g. inability to access a social or essential service, information or facility the individual is entitled to access; humiliation, distress or stress; or financial impact if the system doesn’t work properly. |
| Individuals who are the subject of processing are less vulnerable to privacy harms, e.g. high capacity or ability to exercise privacy rights, not part of a vulnerable group in society, unlikely to experience bias. | Individuals who are the subject of processing are more vulnerable to privacy harms, e.g. lower capacity or ability to exercise privacy rights, more likely to experience bias, part of a vulnerable group in society. |
| Example: verification to authenticate a user or facilitate access. | Example: identification for surveillance, monitoring or profiling. |

Look at the whole picture

Assessing the overall risk requires you to consider the biometric system as a whole and the context in which your biometric processing will take place. You need to consider all the factors that increase or decrease your risk. The modifiable risk factors (such as what information is collected) are a way to mitigate the risk by changing how the system operates to reduce your overall risk. Read more about protecting biometric information to lower the overall privacy risk.

Unacceptable risk

In some cases, there may be factors that make the risk unacceptable: for example, if you do not have sufficient security safeguards to meet the requirement in rule 5 to keep the information secure, or if the accuracy of the system is not high enough to meet the requirement in rule 8 to ensure information is accurate before use. If the risk is unacceptable, you cannot continue with collecting biometric information unless you can sufficiently decrease the risk.
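
As a rough illustration of this gating step (a hypothetical sketch; the function and its inputs are our own framing, not wording from the Code):

```python
def risk_is_acceptable(meets_rule5_security: bool, meets_rule8_accuracy: bool) -> bool:
    """Illustrative gate: if baseline Code requirements such as rule 5 (security
    safeguards) or rule 8 (accuracy before use) cannot be met, the risk is
    unacceptable and collection cannot continue until the risk is decreased."""
    return meets_rule5_security and meets_rule8_accuracy

# e.g. strong security controls but accuracy below the rule 8 standard:
assert risk_is_acceptable(True, False) is False
```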


Weighing benefits against risk

Adopting biometric systems should be driven by an analysis of the benefits and risks, rather than the availability or appeal of the technology. To support this, the proportionality assessment requires organisations to weigh up:

  • the benefit of using biometric processing to achieve the lawful purpose 
    against
  • the scope, extent and degree of privacy risk.

This section discusses how to assess the benefit and how to do the weighing exercise; more guidance on risk is included in the privacy risk section.

Assessing the benefit of a biometric system

An organisation’s purpose for implementing a biometric system is the reason the system is needed (the “why”). The benefit of a biometric system is the positive effect or value that results from its implementation (the “consequences from the why”, or “why does the why matter?”). The achievement of your purpose may also be your benefit if your purpose has been expressed in sufficiently specific terms.

Because a biometric system is an automated way of recognising someone or inferring certain traits, the benefits of using a biometric system will typically reflect the fact that it’s an automated process. For example, increased efficiency, reliability or consistency, reduced manual errors (if the system is sufficiently accurate), faster turnaround times, quick decision making, user convenience and streamlining processes. 

  • Example: If an organisation wants to use a biometric system to verify clients’ identities as part of the organisation’s obligations to prevent money laundering (the purpose of collecting biometric information), the benefits of using the system might include: a high level of accurate verifications, increased convenience for most clients by allowing remote verification, reduction in time to process a verification, and removing the need to retain scanned copies of identity documents.

Be clear and specific

When assessing the benefit of achieving your purpose, you need to be clear on the specific benefit you expect to achieve and what kind of benefit it is (public benefit, benefit to the relevant individual/s, or private benefit to the organisation). 

You should clearly document the benefit. Like your purpose, the benefit must be specific and directly linked to the biometric processing. For example, the benefit needs to be more specific than a generic “improved customer experience” or “improved safety” – be clear on the actual specific improvement and how it will be achieved through biometric processing. You should be able to explain what problem you are trying to solve, or what would happen without the biometric processing.

Examples of specific benefits:

  • Benefit to the organisation: Increased security of access to a restricted information database by using fingerprint scanning as a form of multifactor authentication. This will reduce the risk of unauthorised access to the restricted information.
  • Clear benefit to individuals: Improved customer experience for entering facility through offering facial recognition as an alternative option to increase the speed of entry and eliminate the need to carry a physical access card, thus increasing customer satisfaction for those who choose to use the facial recognition option.
  • Public benefit: Improved efficiency and security at the New Zealand border through the use of biometric-based passport controls.

The better the biometric system works, the greater your benefit 

The benefit is related to how effective your biometric processing is in achieving the intended purpose – the more effective a biometric system is at doing what it was set up to do, the greater the benefits produced. The reverse is also true: less effective or unfit systems will provide fewer benefits, and it will be harder to determine that they are proportionate. (See also the section on effectiveness).

You also need to have reasonable grounds for assessing the scale of the benefit. For example:

  • What is the level of increase in staff and customer safety?
  • To what extent can this increase be directly attributed to the biometric processing?
  • What is the increase in the level of security of the information database?
  • What is the expected improvement in customer satisfaction?
  • How much more effective will the facial recognition system be over the existing process?

It is not necessary to have an exact percentage improvement, but you should have a general idea of how much benefit comes from the biometric processing – e.g. a moderate improvement in customer safety or a small increase in security of information access.

Does the benefit of the biometric system outweigh the risk?

Once you have clearly established what the expected benefit of your biometric processing is, you need to consider whether that benefit outweighs the privacy risk, taking into account the different standards that apply to the categories of benefit. What you are asking is whether the benefit (e.g. additional security, efficiency, time, convenience, user experience, or reduced cost) justifies the risk that you identified when assessing your privacy risk.

As outlined above, a biometric system can benefit the public, the individual whose biometric information you are collecting, and/or the organisation collecting the biometric information. Depending on who is accruing the benefit of the system, the Code requires a slightly different assessment when considering whether the benefit outweighs the privacy risk:

  • A public benefit needs to outweigh the privacy risk. A benefit is not a “public benefit” just because it may benefit some members of the public. A public benefit is when there is a benefit for a meaningful section of the public.
  • A benefit to the individuals whose biometric information you’re collecting needs to be a clear benefit, and it needs to outweigh the privacy risk. This means that the benefit to the individuals needs to be obvious and specific. For example, if the benefit to the individual is increased convenience, this should be an obvious and specific improvement for that individual – not just a general improvement in broader convenience that may or may not benefit that individual.
  • A benefit to the organisation collecting the biometric information needs to outweigh the privacy risk by a substantial degree.
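
As a rough way to make these three standards concrete, here is a minimal, hypothetical sketch in Python. The ordinal risk/benefit scale and the margin used to model “a substantial degree” are our own assumptions – the Code does not prescribe numeric scores:

```python
from enum import IntEnum

class Level(IntEnum):
    """Illustrative ordinal scale only; the Code sets no numeric thresholds."""
    LOW = 1
    MODERATE = 2
    HIGH = 3

def benefit_outweighs_risk(category: str, benefit: Level, risk: Level,
                           benefit_is_clear: bool = True) -> bool:
    """Apply the Code's three weighing standards (illustrative logic only)."""
    if category == "public":
        # A public benefit needs to outweigh the privacy risk.
        return benefit > risk
    if category == "individual":
        # A benefit to the individuals must be a clear benefit
        # (obvious and specific) and outweigh the privacy risk.
        return benefit_is_clear and benefit > risk
    if category == "organisation":
        # A private benefit must outweigh the risk by a substantial degree.
        # Modelling "substantial" as a two-level margin is our assumption.
        return benefit - risk >= 2
    raise ValueError(f"unknown benefit category: {category}")

# A high-risk system with only a moderate benefit to the organisation fails:
assert benefit_outweighs_risk("organisation", Level.MODERATE, Level.HIGH) is False
# A low-risk system with a clear, moderate benefit to individuals passes:
assert benefit_outweighs_risk("individual", Level.MODERATE, Level.LOW) is True
```

If a purpose has benefits in more than one category, each purpose is weighed separately against the relevant standard – see the multi-purpose discussion below.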

Public or customer opinion (e.g. whether the public supports the biometric processing) can be relevant to both the benefit and the privacy risk, but is not in itself determinative. That is, just because a majority of your customers support or do not oppose the processing does not mean that the benefit will outweigh the risk.

If your biometric system is high risk, you will need a correspondingly significant benefit for the processing to be proportionate overall. If your system is low risk, then it could be proportionate even if you have only identified minor advantages from using the system, like a 50 percent efficiency gain and improved user experience. If your system presents a high level of risk but you only achieve modest or limited benefits, you will need to modify the risk to be lower (see the section on privacy risk) or the processing will not be proportionate.

Read the rule 1 example scenarios to see how the weighing exercise could work in practice.

What if my organisation is using biometric processing to achieve several different purposes and benefits?

If you intend to use biometric processing for multiple lawful purposes and the purposes have benefits in different categories, then you should consider the proportionality for each purpose separately, according to the relevant benefit for each purpose. 

The purpose of your biometric processing needs to be proportionate when considering only one of the benefit categories. For example, if your purpose has advantages to multiple groups (e.g., there is both a public benefit and a clear benefit to the people subject to the processing), your system still needs to be proportionate based on just one of the benefit categories (i.e. you can’t tally multiple small benefits to claim the system is proportionate). 

Example: a retail store intends to use FRT for the purposes of improving staff and customer safety and preventing stock losses by generating alerts to guide an appropriate staff response when individuals on a watchlist enter the store. The store considers the proportionality of each purpose separately:

  • There is a public benefit for the staff and customer safety purpose. To be proportionate this benefit needs to outweigh the privacy risk.
  • There is a private benefit to the store from the loss prevention purpose. To be proportionate this benefit needs to outweigh the privacy risk to a substantial degree.

If you are running a trial, how do you meet the proportionality requirement?

If you are running a trial under rule 1 to assess how well the biometric system works, then you may not know in advance exactly how beneficial using biometrics in your context is, until after the trial is complete.

The Code requires that you have reasonable grounds to believe that your biometric processing is proportionate. This threshold (belief on reasonable grounds) is flexible depending on the circumstances. Therefore, if you are thinking about conducting a trial of a biometric system, you need to believe, with good reason, that the trial is a proportionate course of action given the privacy risks and the likely or anticipated benefits.

We would expect there to be a reasonable and objective basis for your belief that at the end of the trial, there will be a benefit that outweighs the risk, assuming that the trial demonstrates that the system is sufficiently effective. Read our trial guidance.


Cultural impacts and effects on Māori

Part of determining whether your proposed use of biometrics is proportionate is working through the cultural impacts and effects of the biometric processing on Māori.

How would this affect your proportionality assessment? 

The Code requires you to have reasonable grounds to believe that the biometric processing is proportionate to the likely risks and impacts on individuals, after specifically taking into account the cultural impacts and effects on Māori. Identifying and addressing cultural impacts and effects is a necessary part of the proportionality assessment. 

Cultural impacts and effects could result from:

  • Cultural perspectives (e.g. tikanga Māori, Māori data sovereignty, te Tiriti o Waitangi and He Whakaputanga o te Rangatiratanga o Nu Tireni) that affect how Māori view or are impacted by biometric processing.
  • Any different impact the biometric processing has on Māori, for example discrimination against Māori due to bias in the biometric system (e.g. bias leads to adverse decisions against Māori individuals at a higher rate than non-Māori).

Māori perspectives on privacy and biometric information

Biometric information is of cultural significance to Māori

Personal characteristics such as a person’s face or fingerprints are so inherent to the identity of a person that Māori treat them with special sensitivity. They are imbued with the tapu of that individual, which restricts the way in which biometric information is engaged with. From a Māori perspective, tikanga (values and practices) such as tapu, whakapapa, mauri, noa, mana, hau and utu should influence how you collect, store, access, maintain and disclose biometric information.

Violations of biometric information require appropriate redress 

A failure to observe Māori perspectives on privacy and biometric information may result in a hara or violation. In addition to any other harm, a hara creates a disparity between the parties involved. Such violations can impact the whakapapa, tapu, mana, mauri and hau of the affected party and must be corrected by the offending party, for example through an apology, karakia, reparation, rectification of the technology or finding alternatives for the individual to use. 

Māori data sovereignty

Māori data sovereignty gives effect to the inherent sovereign rights and interests Māori have over the collection, ownership and application of their data as a taonga under te Tiriti o Waitangi. The principles of Māori data sovereignty are a self-determining framework that influences the way that Māori control their information, including biometric information. Te Mana Raraunga (the Māori Data Sovereignty Network) have outlined how key principles translate to concrete ways to protect Māori data and can help all organisations consider how the use of biometrics could impact and affect Māori.

Government agencies must consider te Tiriti 

Government agencies will need to consider any use of biometric information in the context of te Tiriti obligations and while considering the power imbalance between the Crown and Māori. For instance, how do principles such as tino rangatiratanga and partnership impact the use of Māori biometric information? 

Definitions for key concepts

The definitions below come from Kukutai, T., Campbell-Kamariera, K., Mead, A., Mikaere, K., Moses, C., Whitehead, J. & Cormack, D. (2023). Māori data governance model. Te Kāhui Raraunga.

  • Mātauranga: Māori knowledge system.
  • Mauri: life force.
  • Taonga: those things and values that we treasure, both intangible and tangible.
  • Tapu: sacred, restricted, or prohibited.
  • Tikanga: custom, rules.
  • Whakapapa: genealogy.

Identifying and addressing cultural impacts

First, organisations must make a reasonable effort to assess what the cultural impacts and effects on Māori could be. Then, consider whether and how to address them. What this requires in practice will change depending on your specific use case and context.

In general, we expect agencies to consider:

  • Is it appropriate to specifically partner or engage with Māori whose information you intend to collect to gather their views? If so, who should you engage with – whānau/hapū/iwi, Māori individuals, Māori communities, all of the above?
  • What is the risk of discrimination and bias against Māori from the use of the biometric system?
  • Do you know what tikanga are engaged by your use of biometrics? Is your intended collection and use of biometrics consistent with those tikanga? 
  • Is your planned use of biometrics consistent with principles of Māori data sovereignty?
  • Will Māori individuals/groups be involved in the ongoing co-governance, partnership, oversight or audit of your biometric system? If so, what representation from the people whose biometric information you are collecting will be necessary?

Once you have identified the potential cultural impacts and effects on Māori, if there are any negative impacts or effects, you need to consider whether and how to address those impacts. Some impacts or effects may not be able to be addressed. Failure to address those impacts or effects is a factor to be considered and may make the processing less proportionate.

If you do not have the internal expertise to make these assessments, you should consider whether it is appropriate to engage external advisers to provide cultural advice. The “further resources” section has links to other guidance which could assist you.

Do you need to collect ethnicity information to comply with this requirement? 

You need to address cultural impacts whether or not you know specific ethnicity or cultural information about each impacted individual. In most cases, if you are undertaking biometric processing in New Zealand, it is very likely you will be collecting Māori biometric information and so it is important that you consider what cultural impacts there may be. 

In general, you do not need to collect ethnicity or cultural information to be able to consider potential cultural impacts. But, if you think you need to know how many Māori people may be impacted by your processing, you could consider using other metrics such as general population information to help you in your assessment.

Handling biometric information in accordance with tikanga 

Collecting, storing and using biometric information in accordance with tikanga is one way of addressing cultural impacts and effects (but it is not the only way). Some starting points include:

  • Ensuring that an individual’s mana, mauri, hau, whakapapa and tapu are respected throughout the collection, use and disposal of biometric information.
  • Considering the protection of Māori biometric information from a collective, rather than solely individual, perspective. In some cases, it may be appropriate not to privilege individual privacy at the expense of the collective benefit. 
  • Ensuring that biometric data of living individuals is not stored with biometric data of deceased individuals to protect their tapu.
  • Holding Māori biometric information in New Zealand.
  • Consideration of the concepts of utu (reciprocation) and ea (resolution or balance) in addressing any privacy breaches.

An example of a specific cultural concern for Māori is capturing images of moko and moko kauae (traditional facial tattoos), e.g. through a facial recognition system. Moko contain deeply sensitive and tapu information about an individual’s identity such as whakapapa, whānau/hapū/iwi, whenua, ancestors and origins. Even if the biometric system does not specifically analyse the moko itself, the use or misuse of images that include moko can affect the tapu, mana and mauri of the individual, and their whānau, hapū and iwi.

Free, prior and informed consent

Free, prior and informed consent is an important principle underpinning Māori data sovereignty (principle: manaakitanga | reciprocity). Free, prior and informed consent involves agreement from the individual to the collection and use of their biometric information based on adequate information, appropriate timing and an absence of coercion. It gives people a genuine choice and autonomy over their information.

There must be a genuine alternative available to individuals for consent to be considered free, prior and informed. If there is no alternative, then individuals cannot freely consent.

In many cases, if you can gain the free, prior and informed consent of individuals before collecting their biometric information, this will be a valuable way to mitigate any negative cultural impacts.

Consent that doesn’t meet the standard of free, prior and informed should be balanced by strong governance arrangements (partnership, oversight, and/or accountability mechanisms).

While free, prior and informed consent is not a mandatory part of the Code (and will not be appropriate or feasible for all circumstances), if it is an option for your specific use case, it is a good way to address cultural impacts.

Choose the right technology and processes to avoid bias

Identifying and addressing bias in a biometric system is critical to avoiding negative impacts on Māori. Bias can enter the biometric system in different ways, from technical bias (i.e. repeatable errors produced by the algorithm making a comparison or inference) to human bias (e.g. assumptions or judgements by people acting on the results). Mitigations include:

  • Due diligence when choosing your technology or biometrics provider.
  • Testing and/or auditing results.
  • Putting processes in place to check unconscious bias.
  • Ability for people to challenge and/or correct results.

Protecting children and young people 

Take particular care when using a biometric system that may put the privacy rights of tamariki or rangatahi at risk e.g. adding them to a watchlist. In Mātauranga Māori, children are highly valued and considered taonga with inherent mana. They are a significant whakapapa link to the past and future. There is a strong imperative to minimise collection of their biometric information as they are more vulnerable to harms. In addition, breaches of privacy can affect whānau trust in government, business and systems.

Resources

Read our example scenarios of how an organisation might apply rule 1 in context.