Technologies such as Face ID on the iPhone XS and Facebook’s facial recognition in photos have reopened the debate on the privacy of biometric data. Gemma Garcia Lopez, emotional data analyst, journalist and researcher, investigates the tentative privacy policies and the debate surrounding personal data.

Everyone has the right to the protection of his or her personal data, as stipulated in Article 8 of the Charter of Fundamental Rights of the European Union. However, the emergence of new technologies and the massive use of data have led people to question how big companies handle their privacy policies and the use they make of personal information, especially after the controversial data leak involving Facebook and Cambridge Analytica.

With the aim of establishing the rules that govern the protection, processing and circulation of personal data, the General Data Protection Regulation (EU) 2016/679 finally came into force on May 25, 2018. The regulation brings many changes, but it also adds a priority element: the privacy of biometric data.

Article 4(14), Definitions: ‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.

The new regulation not only covers privacy for data extracted with tools such as facial recognition or eye tracking, but also classifies these data as one of the special categories of personal data. Article 9 of the GDPR prohibits the processing of biometric data for the purpose of uniquely identifying a natural person; however, such processing can be justified if the data subject gives explicit consent, if it is necessary to protect vital interests, or for the fulfilment of obligations and the exercise of rights, among the other grounds listed in Article 9(2).


These exceptions leave the door ajar for companies that use this type of data, with requesting the user’s consent as the safest route. For these cases, the new regulation introduces the figure of the Data Protection Officer, whose functions include supervising compliance with the regulation, cooperating with the supervisory authority before data processing begins, and informing and advising the controller and the employees who work with such data.


The summary of the regulation is simple: companies can no longer use and process personal data at their own discretion. At least not in Europe, and at least not if things are done properly. As a result, large companies such as Facebook, Apple and Google have had to adapt their privacy policies for users within the European Union (EU) and the European Economic Area (EEA).

In the case of Apple, we can see the difference between the iPhone X and the new iPhone XS, XS Max and XR. When Apple launched the iPhone X with Face ID, it only needed to apply the standards of its own privacy policy. Now, after the introduction of the iPhone XS and with the GDPR in force, Apple has to adapt this technology to the new regulation in Europe. Otherwise, the Data Protection Officer must carry out an impact assessment, an action required under Article 35 of the GDPR when it is likely that a type of processing, “in particular using new technologies, and taking into account its nature, scope, context and purposes, entails a high risk to the rights and freedoms of natural persons…”.

Facebook’s facial recognition in photos is also in the spotlight, because it makes use of biometric data, which should not be enabled under a default configuration. Identification of the people in a photograph used to arrive in users’ accounts switched on by default, but under the new regulation it becomes a special category of data, since it is processed with specific technical means that allow the identification and authentication of a natural person. If the data is to be used, its use must be regulated.

“Businesses are able to process biometric data if they have explicit consent from the individual. It is important to point out that ‘explicit consent’ is different to regular consent under GDPR,” explains Danny Ross, Data Privacy and Security Attorney. But then, how will companies obtain that consent from now on? We may see a proliferation of fine print in the Terms and Conditions, of “I consent” or “I agree” checkboxes, or the arrival of new loyalty strategies. As Ross points out, from now on the methods that companies use to obtain consent become important.
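To make the idea concrete, here is a minimal, purely illustrative sketch in TypeScript of what recording explicit consent for biometric processing might involve. The interface, field names and functions are hypothetical assumptions, not the GDPR’s wording or any company’s actual implementation; the sketch simply tries to capture that consent is tied to a specific purpose and policy version, is given by an affirmative action rather than a pre-ticked box, and can be withdrawn.

// Hypothetical consent record for biometric processing; names are illustrative only.
interface BiometricConsentRecord {
  userId: string;          // who gave consent
  purpose: string;         // the specific processing purpose shown to the user
  policyVersion: string;   // the exact version of the policy text the user saw
  explicit: boolean;       // given via a dedicated affirmative action, never pre-ticked
  grantedAt: Date;         // when consent was recorded
  withdrawnAt?: Date;      // consent should be as easy to withdraw as to give
}

// Create a record only after the user performs an explicit opt-in action.
function recordConsent(userId: string, purpose: string, policyVersion: string): BiometricConsentRecord {
  return { userId, purpose, policyVersion, explicit: true, grantedAt: new Date() };
}

// Withdrawal timestamps the record; processing would have to stop from that point on.
function withdrawConsent(record: BiometricConsentRecord): BiometricConsentRecord {
  return { ...record, withdrawnAt: new Date() };
}

// Example usage:
const consent = recordConsent("user-123", "facial recognition in photos", "policy-2018-05");
const revoked = withdrawConsent(consent);

The point of the sketch is only that, under an explicit-consent model, something like this record, rather than a clause buried in the fine print, would have to exist before any biometric processing begins.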

We are facing a new law that raises big questions and new challenges. While the GDPR evolves, companies have an obligation to respect and comply with it, and users must know and assert their rights.