The Code applies to biometric information as a class of information and to the activity of biometric processing by a biometric system.
The Code applies to all organisations (businesses, government agencies, NGOs) that collect biometric information for biometric processing, with limited exceptions. "Agency" is the term used in the Privacy Act (and is defined in section 4 of the Privacy Act), but we've used the term "organisation" in this guidance.
See “What does the Code not apply to” for more information.
Biometric information is information about a biometric characteristic that is used for the purpose of biometric processing by a biometric system. Biometric characteristic includes:
Biometric information also includes:
Biometric information does not include any information about an individual’s biological or genetic material (e.g. blood or DNA), brain activity or nervous system.
There are many different types of biometric systems and possible uses for biometric information. Some of the most common types of biometric information/biometric systems are:
A biometric system means a computer-based or technological system that is used for biometric processing. It includes any related devices and components needed to carry out the processing, such as cameras, scanners, comparison algorithms and tokens. For example, the FRT system used for border control uses ePassports (tokens) and eGates (cameras and comparison process).
A biometric system typically uses both hardware and software elements to calculate an outcome (comparison score / match) or control a process (facilitate access) and may involve human input, assistance or oversight.
It does not include a system that relies solely or primarily on human analysis i.e. a purely manual system.
The key question is: who or what is analysing the biometric information? If the analysis is being performed by the biometric system, then it will be included within the definition and subject to the Code. But if the analysis is solely or primarily done by a human, then it won't fall within the definition and won't be subject to the Code. Even if it is not subject to the Code, it will still be subject to the Privacy Act.
| Examples of biometric information covered by the Code | Examples of information not covered by the Code |
| --- | --- |
| A photograph of someone’s face that is being used in a facial recognition system (also called FRT). | A photograph of someone’s face which you are using in an internal newsletter. |
| Footage of someone walking that will be analysed by a biometric system to identify the person by their gait. | Footage of someone walking from a CCTV system that will not be processed in a biometric system. |
| A recording of someone’s voice which will be analysed by a biometric system to identify that person. | A recording of someone’s voice that is not analysed by a biometric system, e.g. a recording of a call taken for record-keeping purposes. |
| Information about someone’s mood which you learn about through analysis by a biometric system. | Information about someone’s mood which you learn about through the person taking a survey. |
| Numerical information extracted from an image of someone’s face to represent their features (a biometric template). | A DNA or blood sample. |
Biometric processing means comparing or analysing biometric information, using a biometric system, to either verify, identify or categorise a person.
Biometric verification is the automated verification of an individual’s claimed identity. It involves comparing a person’s biometric information with other biometric information that has been previously associated with them (e.g. previously enrolled in the system or in an identity document) to confirm whether they match (i.e. are sufficiently similar). It asks the question “Is this person who they say they are?”. Verification is often used as a security measure to protect personal information or prevent fraud e.g. when someone uses an electronic passport gate at the airport. Verification is sometimes called one-to-one (1:1) matching.
Biometric identification is the automated recognition of a person’s biometric characteristic (e.g. face, fingerprints) to identify them by comparing their biometric information against the biometric information of multiple people held in the system. It asks the question “Is this person on the database?” or “Do we know this person?”. Identification can be used, for example, to recognise people who are allowed to enter a space and facilitate their access, or by law enforcement to identify persons of interest on a watchlist. Biometric identification is sometimes called one-to-many (1:N) matching.
Biometric categorisation means analysing a person’s biometric information to learn certain things about them, e.g. using a biometric system to detect someone’s emotions, infer their gender from video footage or estimate their age from their face. See the biometric categorisation section below for more information.
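The 1:1 and 1:N distinction can be sketched in code. This is an illustrative toy only: the numeric "templates", the cosine-similarity measure and the 0.95 threshold are all made-up stand-ins, not how any real biometric system works.

```python
import math

def similarity(a, b):
    """Toy similarity score between two numeric 'templates' (cosine similarity)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

THRESHOLD = 0.95  # hypothetical match threshold

def verify(probe, enrolled):
    """1:1 verification: 'Is this person who they say they are?'
    Compares the probe against ONE previously enrolled template."""
    return similarity(probe, enrolled) >= THRESHOLD

def identify(probe, gallery):
    """1:N identification: 'Is this person on the database?'
    Compares the probe against MANY enrolled templates and returns
    the best match above the threshold, or None if there is no match."""
    best_id, best_score = None, THRESHOLD
    for identity, template in gallery.items():
        score = similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```

The key difference is the shape of the comparison: verification takes a single claimed identity and answers yes or no, while identification searches a whole gallery and returns a candidate identity (or nothing).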
| Examples of biometric processing activities covered by the Code | Examples of activities not covered by the Code |
| --- | --- |
| Using a machine-based facial recognition system to identify when individuals in a database enter your business, with a staff member confirming how to respond. | Having a staff member with a list of people’s faces look out for those individuals. |
| Using a software program to automatically compare someone’s driver’s licence against another photo of that person to confirm that it is the same person. | Manual comparison of a driver’s licence with another photo to confirm the person is the same. |
| Using an algorithm to produce a list of possible identities of a person based on their face. | Having a staff member manually produce a list of possible identities of a person. |
| Automated analysis of CCTV footage to identify when an individual is at a site. | Manual review of CCTV footage. |
| Use of age-estimation software to estimate the age of users based on facial features. | A staff member using their human judgement to estimate a customer’s age. |
Note: The Information Privacy Principles (IPPs) apply to personal information that is not covered by the Code.
Biometric categorisation is when you use an automated process to analyse biometric information to collect, infer, detect or generate certain types of sensitive information, or to categorise the individual into a demographic group.
Biometric categorisation covers the collection or inference of the following types of sensitive information:
Biometric categorisation does not include using a biometric system to detect readily apparent expressions, gestures or movements which are things you can observe or record visually or aurally without using biometric processing. For example, whether an individual is nodding or has their eyes closed, whether they are whispering or shouting, or whether the individual uses a wheelchair or is wearing a mask.
This exclusion means that, in general, processes that detect aspects of a person’s face or body to apply a filter or virtual try-on feature, or editing software that categorises people in photos or videos to modify or sort them, will not be subject to the Code (but may still be subject to the Privacy Act).
Biometric categorisation also does not include any analytical process that is integrated in a commercial service or consumer device and is for the purpose of providing the user with their own health information, personal information, or an entertainment or immersive experience.
In general, processes in consumer wearables (e.g. in fitness trackers) that provide the user with their information or processes in face and body tracking cameras used to facilitate immersive video games (e.g. in VR headsets) will not be subject to the Code (but may still be subject to the Privacy Act).
See rule 10 for more information about the limits on biometric categorisation.
| Biometric categorisation includes collecting, obtaining, inferring or detecting… | For example, using an automated process to… |
| --- | --- |
| Health information. (See also the section on when the Code applies to health agencies.) | |
| Personal information relating to an individual’s personality. | |
| Personal information relating to an individual’s mood. | |
| Personal information relating to an individual’s emotion. | |
| Personal information relating to an individual’s intention. | Note: readily apparent expressions are excluded from the biometric categorisation definition, e.g. inferring from a VR headset wearer’s gaze or movements that they want to go in a certain direction, where this is externally observable without automated processing. |
| Personal information relating to an individual’s mental state. | |
| Personal information relating to an individual’s state of fatigue. | |
| Personal information relating to an individual’s alertness. | |
| Personal information relating to an individual’s attention level. | |
| Categorising the individual into a demographic category on the basis of a biometric characteristic. | |
| Biometric categorisation does not include… | For example… |
| --- | --- |
| Detecting a readily apparent expression. | |
| Personal use and entertainment exclusion: any analytical process that is integrated in a commercial service, including any consumer device, solely for the purposes of providing individuals with their health information, their personal information or an entertainment or immersive experience. | |
In general, the following activities will not be regulated by the Code as they do not fall within the definition of biometric categorisation (or verification or identification). These activities may still involve the collection and use of personal information, in which case the organisation carrying them out must comply with the Privacy Act.
If you are doing the above types of activities, you should still consider the definitions of biometric verification, identification and categorisation to be confident that your specific use is not covered by the definition.