Lawmakers around the world have placed strict rules on companies that collect and process consumers' biometric data, and Australia and New Zealand are no different. Processing biometric data is a great responsibility because, unlike most other types of data, biometric data can never be changed.
In Australia, an entity subject to the Privacy Act can only process sensitive information—which includes biometrics—with the express consent of the individual. But what does “express consent” mean and how do entities obtain it? And how do companies lawfully process biometrics in New Zealand where there is no consent requirement?
Australia
The law
The primary piece of legislation is the Privacy Act. Schedule 1 of the Privacy Act contains a set of Australian Privacy Principles (APPs) that govern the collection and processing of personal data, including biometric information. Consent from each consumer is a key prerequisite to the collection and use of their biometric data.
APPs 3 and 6 of the Privacy Act give some helpful rules around how to lawfully collect consent. APP 3 states that any collection of data must be ‘lawful and fair’ and APP 6 sets out a list of information an individual must be told, including the consequences of not allowing the collection by the entity.
The regulator
Building on these principles in the APPs, the Office of the Australian Information Commissioner (OAIC) has set out some guidance on the collection of consent:
Consent must be informed
- This means the user must be informed in clear and intelligible English how their data will be used. Good practice is to communicate “just in time” as the data is collected in a few simple statements, with a linked privacy policy containing a more detailed explanation. The privacy policy should be accessible at any time by the individual.
- Unfortunately, we still see processing notices in the user journey that are far too long and complex, especially when read on a mobile device, meaning that very few consumers really understand what they are consenting to. This is a problem that OAIC could perhaps address with specific guidance on this point.
Consent must be voluntary
- Here, OAIC expands on APP 6. The entity needs to tell the individual about the consequences of not giving consent. If the only way through the journey is to consent to the processing of biometrics, and no alternative route is offered, then the consent is arguably not voluntary, and a consumer who declines is simply a lost sale to the entity.
- This also means that individuals cannot be forced into giving consent. For example, an employer could not introduce mandatory biometric gates at work because the staff would have no choice but to give consent, otherwise they would lose their job. Any consent given would not be voluntary.
- It also means that for essential services such as banking, health, and government services, the individual cannot be required to use a biometric onboarding or authentication method; there must be a valid alternative option. The ATO’s digital identity app is based on biometrics for onboarding and authentication. It is possible to log into the ATO without use of the biometric app, but it is nowhere near as convenient as using the app. It would be interesting if the ATO was ever challenged on this point by OAIC or in the courts.
- Interestingly, the EU and the UK under the GDPR also require consent to be a genuine choice, but the US differs on this point. In the US it is common practice to use biometrics to record staff arriving and leaving work.
Consent must be current and specific
- Entities cannot collect data from consumers for unspecified purposes just in case they might need the data in the future. The collection must be for a lawful purpose that is explained to the consumer at the time of collection. Entities need to strike a balance between providing information that is too vague—which would be unlawful—and too specific, which might mean that if the purpose changes even slightly they no longer have the lawful right to use the data for the amended purpose.
- APP 6 distinguishes between the “primary purpose” of processing and a “secondary purpose”, where the entity changes the reasons for processing slightly. For biometric data, processing for a secondary purpose is only permitted where that purpose is “directly related” to the primary purpose, which is intended to be a high bar.
The individual must have capacity to give consent
- This obligation is to ensure that the individual is of age to consent and has mental capacity.
There is always a balance to be struck by entities when obtaining consent between OAIC’s first requirement (clear and intelligible English) and providing all possible information to individuals. From a UX perspective, if an entity tries to put too much information on the screen in the name of clarity, the user will not engage with it and the opposite effect is achieved: the user will not understand what they are being asked to consent to. Entities should provide only the key information on the screen itself, with more detailed information for users in a linked privacy policy.
Another risk of too much information is that consumers develop “consent blindness,” where users just give consent without even trying to understand what they are consenting to. Arguably this has happened with the proliferation of cookie consent. Consumers (myself included) just blindly tick the cookie consent pop-up without reading the information presented or considering the options. There is an opportunity for OAIC to publish some good guidance for designers and UX teams on what a ‘good’ consent page looks like.
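To make this concrete, here is a minimal sketch in TypeScript of how an entity might structure a just-in-time biometric consent prompt and record the consent event so that it can later be evidenced as informed, voluntary, current, and specific. All names, copy, and URLs are hypothetical and are not drawn from the OAIC guidance or any real SDK.

```typescript
// Hypothetical sketch only: names, copy, and URLs are illustrative.

interface ConsentPrompt {
  statement: string;     // short, plain-English purpose statement shown on screen
  policyUrl: string;     // link to the fuller explanation in the privacy policy
  declineOption: string; // the non-biometric alternative offered to the user
}

interface ConsentRecord {
  userId: string;
  prompt: ConsentPrompt; // exactly what the user was shown
  granted: boolean;
  grantedAt: string;     // timestamp, to evidence that consent is current
}

const biometricPrompt: ConsentPrompt = {
  statement:
    "We take a photo of your face to check it against your ID document. The photo is deleted after the check.",
  policyUrl: "https://example.com/privacy#biometrics",
  declineOption: "Verify my identity another way",
};

// Record the consent event at the moment it is given ("just in time").
function recordConsent(userId: string, prompt: ConsentPrompt, granted: boolean): ConsentRecord {
  return { userId, prompt, granted, grantedAt: new Date().toISOString() };
}

console.log(recordConsent("user-123", biometricPrompt, true));
```

The point of the sketch is the shape rather than the wording: a single short statement on screen, a genuine decline route, a link to the fuller policy, and a record of exactly what was shown and when.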
The new IVS consent requirements
The recently published Participation Agreement for accessing government passport and driver’s licence databases for identity verification—known as “DVS checks”—includes stricter consent requirements, which companies conducting DVS checks must now meet. Over 140 million DVS checks are conducted each year, meaning that these new, stricter consent requirements will affect tens of millions of Australians and tens of thousands of businesses.
Under the newly passed Identity Verification Services Act (IVS Act), the DVS administrator (part of the Australian government) is required to enter into a contract with each entity doing DVS checks so that individuals’ data can be better protected. The deadline for getting each DVS check recipient (or Business User in IVS Act speak) signed to the DVS Participation Agreement is June 2025. In that Participation Agreement, there are detailed requirements on the consent to be collected from individuals before DVS checks can be conducted.
Whilst DVS checks currently do not include biometric checks, in future the Face Verification Service (FVS) will be made available to private companies and the same consent requirements will apply.
When collecting consent from a consumer to check their data with the DVS, the entity must set out:
- How it will use the DVS;
- What legal obligations it has in relation to that collection of the data;
- What rights the individual has in relation to the collection of the data;
- The consequences of the individual declining to consent;
- Where the individual can get information about making complaints relating to the collection, use and disclosure of the data; and
- Where the individual can get information about the Commonwealth’s operation and management of the DVS in connection with the requesting and provision of DVS checks.
The list of required information to be given to individuals arguably goes further than the Privacy Act and the OAIC guidance. It will not be practical to fit all of this on a single app page on mobile devices, meaning that some of it will be relegated to a linked privacy policy. There is a danger that entities try to put all of this information on the single consent page, with the result that hardly any individuals will understand what they are consenting to.
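As a rough illustration of how an entity might split the six required disclosures between the consent screen and a linked privacy policy, here is a minimal sketch in TypeScript. All field names, copy, and URLs are hypothetical and are not taken from the Participation Agreement itself.

```typescript
// Hypothetical data model for the disclosures required before a DVS check.
// Field names and example values are illustrative only.

interface DvsConsentDisclosures {
  howDvsIsUsed: string;             // how the entity will use the DVS
  entityLegalObligations: string;   // the entity's legal obligations around the collection
  individualRights: string;         // the individual's rights in relation to the collection
  consequencesOfDeclining: string;  // what happens if the individual does not consent
  complaintsInfoLocation: string;   // where to find information about making complaints
  dvsOperationInfoLocation: string; // where to find information about the Commonwealth's operation of the DVS
}

// One short summary on the consent screen...
const onScreenSummary =
  "We check your document details against the government's Document Verification Service. " +
  "See our privacy policy for how the check works, your rights, and how to complain.";

// ...with the full disclosures held in (and linked from) the privacy policy.
const fullDisclosures: DvsConsentDisclosures = {
  howDvsIsUsed: "To confirm that your ID document details match official records.",
  entityLegalObligations: "Explained in the 'Identity checks' section of our privacy policy.",
  individualRights: "Explained in the 'Your rights' section of our privacy policy.",
  consequencesOfDeclining: "You can verify your identity in person at a branch instead.",
  complaintsInfoLocation: "https://example.com/privacy#complaints",
  dvsOperationInfoLocation: "https://example.gov.au/dvs", // placeholder for the official DVS information page
};

console.log(onScreenSummary, fullDisclosures);
```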
The other impact of this list of information to be given to individuals is that, for the consent to be valid, entities will have to offer genuine alternatives to identity verification without a DVS check. Since DVS checks are central to Australian identity verification, it’s unclear what alternatives remain if a consumer refuses both a DVS check and a biometric check—other than requiring an in-person visit.
What makes all of this so much harder for entities conducting DVS checks is that the DVS Administrator considers that a mere photo of a person (i.e. on a document image) is itself a biometric. This means that collecting an identity document image for a DVS check also involves processing biometrics, requiring express consent, and that document images should not be stored, as that would mean storing biometric data.
It will be interesting to see what approach entities take to obtaining consent for DVS checks.
New Zealand
The law
The Privacy Act 2020 in New Zealand does not require consent for the collection or processing of any personal data or even biometric data. In fact, other than a small sub-part of the Act, biometric data is not treated differently to other personal information.
It is fair to say that New Zealand is somewhat unusual in its approach to biometrics across the Western world. Australia (via its own laws and regulations), the EU, and the UK (via the GDPR) all require consent for processing biometrics. Some key US states also mandate consent, leading companies to collect it nationwide despite the lack of a federal law.
In order to tighten the law relating to biometrics, the Office of the Privacy Commissioner in New Zealand (OPC) is consulting on an updated biometric law, and its planned approach is not to require consent at all for the processing of biometric data. The view within the OPC is that online users almost always give consent when asked, without properly considering the impact, and that obtaining consent should therefore not be a lawful ground of processing.
Instead, the latest approach to biometrics in the Privacy Code for Biometrics (currently out for consultation) introduces some tough standards for entities to meet to process biometrics.
The Privacy Code is a set of Rules for the processing of biometrics. Rule 1 sets a high bar for entities to meet before they can process biometrics. The entity must be sure that the processing is (i) lawful, (ii) necessary, and (iii) proportionate to the impact on individuals, and must (iv) put in place privacy safeguards.
The “necessary” test is probably the hardest and most important test for entities processing biometrics. It requires consideration of effectiveness and whether there are alternatives that could be used.
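To illustrate, an internal Rule 1 assessment might be recorded along the following lines. The structure and wording here are purely my own sketch in TypeScript, not taken from the draft Code.

```typescript
// Hypothetical internal record of a Rule 1 assessment under the draft
// Privacy Code for Biometrics. Field names and content are illustrative only.

interface Rule1Assessment {
  processingDescription: string;      // what biometric processing is proposed
  lawfulBasis: string;                // (i) why the processing is lawful
  necessity: {
    effectiveness: string;            // (ii) how well the biometric approach achieves the purpose
    alternativesConsidered: string[]; // non-biometric alternatives and why they were rejected
  };
  proportionality: string;            // (iii) impact on individuals weighed against the benefit
  privacySafeguards: string[];        // (iv) safeguards put in place
}

const exampleAssessment: Rule1Assessment = {
  processingDescription: "Face matching of a selfie against an ID document photo at onboarding.",
  lawfulBasis: "Collected directly from the individual for identity verification.",
  necessity: {
    effectiveness: "Materially reduces impersonation fraud compared with document-only checks.",
    alternativesConsidered: [
      "Document-only check: weaker assurance against stolen documents.",
      "In-person verification: not feasible for an online-only service.",
    ],
  },
  proportionality: "Selfie deleted after matching; no biometric template retained.",
  privacySafeguards: ["Deletion after verification", "Encryption in transit and at rest", "Access logging"],
};

console.log(exampleAssessment);
```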
It will be really interesting to see how this plays out in practice. The risk is that entities will be nervous about using biometrics because the “necessary” test appears to be a high bar to meet; it will always be possible to identify or reauthenticate a person without using biometrics, but it may not be as secure, robust, or efficient. Does that make biometrics “necessary”?
The other issue with the “necessary” test is that it is not black and white for a regulator to enforce. The OPC will have to investigate, consider, determine, and then enforce against each entity on an individual basis. That is true of any regulatory action, but the key difference is that a “necessary or not” determination is open to challenge, given there will be so many edge cases. This means both that the OPC will take more time over each consideration stage and that entities are more likely to challenge any decision of the OPC in the courts.
New Zealand risks a scenario where cautious, law-abiding entities avoid biometrics despite their security benefits, while less ethical ones use them improperly, believing they can challenge the OPC in court.
Summing it all up
Australia and New Zealand are two nations with shared heritage and very close cultural ties. They are both the best of friends at a national level and perhaps the closest of rivals on the sports field. Both nations’ laws are based on the English common law.
But despite the commonalities between Australia and New Zealand, the two nations have each taken a very different approach to consent. For biometric companies trying to do business across the Tasman Sea, compliance with both sets of laws is a challenge.
About the post:
Images and videos are generative AI-created. Prompt: Stylized checkbox being clicked by a stylized cursor. Next to the checkbox are the words, “I consent to biometrics”. The checkbox should have a modern, sleek design with a glowing outline, and as the cursor clicks, the box should display a bold checkmark. The cursor should be dynamic, possibly with a motion blur or glowing effect to emphasize action. The overall aesthetic should be clean and futuristic, with a focus on sharp lines and smooth gradients. Tools: Midjourney, Luma.
About the author:
Peter Violaris is Global DPO and Head of Legal EMEA for IDVerse. Peter is a commercial technology lawyer with a particular focus on biometrics, privacy, and AI learning. Peter has been in the identity space for 7 years and before that worked for London law firms.