On 17 October 2017, the Article 29 Working Party (“Art 29 WP”) published draft guidelines on automated individual decision-making and profiling (“Guidelines”).
In the Guidelines, the Art 29 WP states that profiling and automated decision-making can be useful for individuals and organisations, delivering increased efficiencies and resource savings, whilst recognising that they may pose significant risks to individuals unless appropriate safeguards are put in place.
The Guidelines clarify the provisions of the General Data Protection Regulation (“GDPR”) that aim to address these risks.
What is the difference between automated decision-making and profiling?
The Guidelines distinguish between automated decision-making and profiling.
Automated decision-making refers to the ability to make decisions by technological means without human involvement. Profiling, on the other hand, entails collecting data about an individual and analysing their characteristics or behaviour patterns in order to categorise them and/or make predictions or assessments about their (i) ability to perform a task; (ii) interests; or (iii) likely behaviour.
While the Art 29 WP notes that automated decision-making and profiling are distinct, it recognises that a process which starts out as simple automated decision-making could become one based on profiling, depending on how the data are used.
Article 22 profiling
Under the GDPR, individuals have the right not to be subject to a decision based solely on automated processing, including profiling, which produces (i) legal effects concerning them; or (ii) similarly significant effects on them[1]. The GDPR defines neither ‘legal effects’ nor ‘similarly significant effects’.
According to the Guidelines, a decision produces “legal effects” where it affects an individual’s legal rights, their legal status, or their rights under a contract. The concept of a “similarly significant effect” is, however, less clear-cut. In the Art 29 WP’s view, for data processing to significantly affect someone, the “effects of the processing must be more than trivial and must be sufficiently great or important to be worthy of attention. In other words, the decision must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned”.
While the Art 29 WP recognises that, in many cases, targeted advertising will not have a significant effect on individuals, it suggests that advertising could meet this threshold in certain circumstances. Relevant factors include:
- the intrusiveness of the profiling process;
- the expectations and wishes of the individual concerned;
- the way the advert is delivered; and
- the particular vulnerabilities of the individual who is ‘targeted.’
A person in financial difficulties who is regularly shown adverts for online gambling is cited as an example of an individual who may suffer a ‘legal or similarly significant effect.’ Additionally, automated decision-making that results in differential pricing could have a significant effect if prohibitively high prices effectively bar someone from accessing goods or services.
Organisations that carry out ‘Article 22 profiling’ are prohibited from making such solely automated decisions unless they can rely on one of the following exceptions: (i) the decision is necessary for entering into, or the performance of, a contract; (ii) it is authorised by Union or Member State law; or (iii) it is based on the individual’s explicit consent. These conditions may be extremely difficult to fulfil, especially for online advertisers.
Other key points
The Guidelines also offer useful guidance on the following:
- Right not to be subject to a decision based solely on automated processing: organisations must implement suitable measures to safeguard the individual. These include the right to obtain human review of the decision by someone who has the appropriate authority and ability to change it.
- Rights to be informed and of access: organisations must find a simple way of informing individuals about the rationale behind, or the criteria relied on in reaching, the decision. The information given should be meaningful, rather than a complex explanation of the algorithms used.
- Audits: controllers will need to carry out regular audits of the data sets they process to check for bias, and appropriate measures should be put in place to prevent errors, inaccuracies or discrimination on the basis of special category data (a rough illustration of one such check appears below).
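The Guidelines do not prescribe how such audits should be performed. Purely as an illustration, a simple bias check of the kind described above might resemble the following minimal Python sketch, which compares automated-decision approval rates across groups in a data set. The column names, the ‘four-fifths’ disparity threshold and the example data are all hypothetical assumptions for the purposes of the sketch, not anything drawn from the Guidelines.

```python
# Hypothetical sketch of one check a regular audit might include:
# comparing approval rates of an automated decision across groups.
# Column names ("group", "approved") and the 0.8 threshold are
# illustrative assumptions, not requirements of the Guidelines.
import pandas as pd


def approval_rate_by_group(df: pd.DataFrame,
                           group_col: str = "group",
                           outcome_col: str = "approved") -> pd.Series:
    """Return the proportion of positive decisions per group."""
    return df.groupby(group_col)[outcome_col].mean()


def flag_disparity(rates: pd.Series, threshold: float = 0.8) -> bool:
    """Flag if any group's rate falls below `threshold` times the
    highest group's rate (the 'four-fifths' heuristic, used here
    illustratively; assumes at least one group has a non-zero rate)."""
    return (rates.min() / rates.max()) < threshold


if __name__ == "__main__":
    # Toy example data; a real audit would run over production decisions.
    decisions = pd.DataFrame({
        "group": ["A", "A", "B", "B", "B", "A"],
        "approved": [1, 1, 0, 1, 0, 1],
    })
    rates = approval_rate_by_group(decisions)
    print(rates)
    print("Potential disparity:", flag_disparity(rates))
```

A check of this kind is only a starting point: in practice, controllers would also need documented procedures for investigating and correcting any disparity the audit surfaces.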
Next Steps
The Guidelines are currently in draft form. The consultation on the Guidelines remains open until 28 November 2017. Time will tell what (if any) changes are made when the Art 29 WP publishes the final form of the Guidelines. Businesses would be well advised to keep an eye out for further updates.