Blog: Using biometric data in a fair, transparent and accountable manner

10 May 2019

As technology takes ever greater strides, organisations and businesses are harnessing its capabilities to help manage their contact with customers, including for identification and authentication.

While there are undoubtedly significant benefits in using new technologies, organisations need to be aware of the potential challenges when choosing and using any systems involving biometric data.

In January 2017, HMRC adopted a voice authentication system that asked callers to some of its helplines to record their voice as their password.

A complaint from Big Brother Watch to the ICO revealed that callers were not given further information or advised that they did not have to sign up to the service. There was no clear option for callers who did not wish to register. In short, HMRC did not have adequate consent from its customers and we have issued an enforcement notice ordering HMRC to delete any data it continues to hold without consent.

In the notice, the Information Commissioner says that HMRC appears to have given ‘little or no consideration to the data protection principles when rolling out the Voice ID service’.

She highlights the scale of the data collection – seven million voice records – and that HMRC collected it in circumstances where there was a significant imbalance of power between the organisation and its customers. It did not explain to customers how they could decline to participate in the Voice ID system, or make clear that declining would have no detrimental impact on them.

The case raises significant data governance and accountability issues that require monitoring. We therefore plan to follow up the enforcement notice with an audit that will assess HMRC’s compliance with good practice in the processing of personal data.

It was also found that, before the system was launched, HMRC had no data protection impact assessment (DPIA) in place that appropriately considered the compliance risks of processing biometric data.

Any organisation planning to use new and innovative technologies that involve personal data, including biometric data, needs to think about these key points:

1) Under the GDPR, controllers are required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’, such as the large-scale use of biometric data. A DPIA is a process that should also help responsible controllers incorporate ‘data protection by design and by default’ into their projects – a key concept at the heart of GDPR compliance.

2) When you’ve done your DPIA, make sure you act on the risks identified and can demonstrate that you have taken the assessment into account. Use it to inform your work.

3) Accountability is one of the data protection principles of the GDPR – it makes you responsible for complying with the GDPR and says that you must be able to demonstrate your compliance by putting appropriate technical and organisational measures in place.

4) If you are planning to rely on consent as your legal basis, remember that biometric data is classed as special category data under the GDPR and any consent obtained must be explicit. The benefits of the technology cannot override the need to meet this legal obligation (see the sketch after this list).
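To make point 4 concrete, here is a minimal, hypothetical sketch in Python of what an explicit opt-in flow might look like: the caller is told what their voice data will be used for, a clear route to decline exists with no detriment, and the decision itself is recorded as evidence of consent. Every name here (ConsentRecord, maybe_enrol and so on) is illustrative only and does not reflect HMRC’s actual Voice ID system.

    # Hypothetical sketch of an explicit opt-in flow for biometric enrolment.
    # All names are illustrative, not taken from any real HMRC or ICO system.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    ENROLMENT_NOTICE = (
        "We can use a recording of your voice to verify you on future calls.\n"
        "Taking part is optional: if you decline, you can still use every\n"
        "service in exactly the same way, verified by your existing details.\n"
        "You can withdraw consent and have your voice record deleted at any time."
    )

    @dataclass
    class ConsentRecord:
        """Evidence of the caller's decision, kept so the controller can
        demonstrate consent (GDPR Article 7(1))."""
        caller_id: str
        consented: bool
        notice_shown: str          # the exact wording the caller was given
        decided_at: datetime

    def ask_for_explicit_consent(caller_id: str, answer: str) -> ConsentRecord:
        """Record an explicit, affirmative decision. Anything other than a
        clear 'yes' is treated as a refusal; silence or ambiguity never enrols."""
        consented = answer.strip().lower() == "yes"
        return ConsentRecord(
            caller_id=caller_id,
            consented=consented,
            notice_shown=ENROLMENT_NOTICE,
            decided_at=datetime.now(timezone.utc),
        )

    def maybe_enrol(record: ConsentRecord, voice_sample: Optional[bytes]) -> bool:
        """Enrol only when consent was explicitly given; otherwise discard the
        sample immediately so no biometric data is retained without consent."""
        if record.consented and voice_sample is not None:
            print(f"Enrolling voice print for {record.caller_id}")
            return True
        print(f"{record.caller_id} declined; continuing with standard verification")
        return False

    if __name__ == "__main__":
        print(ENROLMENT_NOTICE)
        decision = ask_for_explicit_consent("caller-001", answer="no")
        maybe_enrol(decision, voice_sample=None)

The key design choice is that refusal is the default: anything short of an explicit ‘yes’ leaves the caller un-enrolled and the voice sample discarded, which is the opposite of the failure described in the enforcement notice.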

This is the first enforcement action taken in relation to biometric data since the advent of the GDPR, which for the first time specifically identified biometric data as special category data requiring greater protection.

Our guidance on informed consent provides advice for organisations planning to use these kinds of systems and we are currently developing our guidance on biometric data.

With the adoption of new systems comes the responsibility to make sure that data protection obligations are fulfilled and customers’ privacy rights addressed alongside any organisational benefit. The public must be able to trust that their privacy is at the forefront of the decisions made about their personal data.

Steve Wood

Steve Wood is Deputy Commissioner for Policy, responsible for the ICO’s policy position on the proper application of information rights law and good practice, through lines to take, guidance, internal training, advice and specific projects.
