In its new Biometrics Guidance, the UK Information Commissioner's Office (ICO) has bravely started to address an issue that has so far remained below the surface in the identity sector: identity providers re-using the biometric data they are entrusted with for their own training purposes.
The ICO notes in its guidance that the practice is “common”. We can only assume the ICO has researched the market and concluded that it is common practice for identity providers to reuse the consumer biometric data they collected to provide services to their clients (typically banks or the government) for their own internal training. As a consumer and citizen, I find this hugely concerning, because it is certainly not common practice to collect consent in the identity provider's name for this training.
Controlling the data
There is no law that says identity providers cannot use personal data for training, but they must comply with the GDPR to do so legally. The ICO has made clear in this guidance that, in its view, identity providers are acting as data controllers in their own right. This is the first time the ICO has stated this conclusion publicly, and it means that identity providers can no longer pretend that they are mere processors when they use data for training. This has large implications for the identity sector, as explained below.
The ICO also makes clear that the only lawful ground for processing biometric data is very likely consent from the consumer. That means identity providers need consent in their own names. Giving consent to the bank or other entity verifying your identity is not sufficient: if the biometric identity provider wants to use my biometric data for training, then I need to give consent explicitly to that identity provider.
That consent must also be freely given, meaning that consumers have a genuine choice whether to say “yes” or “no”. A genuine choice here cannot be conditional on receiving the service they are trying to access; if the consumer says “no” to the identity provider, the consumer should still be able to continue with the identity journey.
What it all boils down to
This conclusion by the ICO leads to some important outcomes:
- Identity providers should be collecting consent in their own names from consumers. It is therefore not clear how an identity provider that reuses biometric information for training can supply a fully white-labelled solution to its clients.
- Consumers must have a clear means of withdrawing their consent, which means identity providers need to inform consumers of (i) the fact that the identity provider is a controller; and (ii) how to withdraw consent.
- Consumers must be able to exercise their right to data deletion unless the identity provider can demonstrate a requirement to retain the data (which I cannot imagine is possible).
All this means a much less streamlined consumer journey for the identity provider's clients, which is not an attractive proposition.
The impact on existing algorithms
The new guidance also calls into question the legality of the algorithms these identity providers have already produced. If the identity providers were not positioning themselves as data controllers before, and were not complying with the GDPR, then how can the resulting algorithms be legally used? The Texas Attorney General made Meta delete its own algorithms using similar logic.
Meta argued that using a mere photo for biometric training is not use of biometric data. The Texas AG strongly disagreed, and I suspect a court in the UK or the EU would too. Buyers of biometric services need to be careful that their identity providers do not try to pull that same trick of logic on them when they ask what data is reused for training.
The solution for buyers of identity services is clear: buy technology trained entirely on synthetic data created by generative AI.
About the author:
Peter Violaris is Global DPO and Head of Legal EMEA for IDVerse. Peter is a commercial technology lawyer with a particular focus on biometrics, privacy, and AI. Peter has been in the identity space for six years and before that worked for London law firms.