Data Protection Impact Assessment

In March 2019, the Information Commissioner’s Office (ICO) released a Call for Input on developing the ICO’s framework for artificial intelligence (AI). At the same time, the ICO launched its AI Auditing Framework blog to provide updates on the development of the framework and to encourage organisations to engage with the ICO on the topic.

On 23 October 2019, the ICO published the latest post in the series. The post outlines the key elements that organisations should focus on when carrying out a Data Protection Impact Assessment (DPIA) for AI systems that process personal data.

We have outlined below some of the key takeaways.


Background

On 4 October 2017, the Article 29 Working Party (“WP29”) released its final guidelines on Data Protection Impact Assessments (“DPIAs”), which had initially been proposed in draft form in April 2017. Article 35 of the General Data Protection Regulation (“GDPR”) requires the controller to carry out an assessment of the impact of the envisaged processing operations where the type of processing is likely to result in a high risk to the rights and freedoms of natural persons. A failure to comply can lead to a fine of up to €10 million or up to 2% of total worldwide annual turnover, whichever is higher.

The WP29’s final version provides additional guidance, particularly on the criteria for determining whether a DPIA is mandatory and on how to carry one out. We explore some of the key guidelines below.

Changes to Criteria

Under the GDPR, a DPIA is required where the data processing is “likely to result in a high risk”. Although the GDPR provides examples of data processing operations that would fall into this category, both versions of the guidelines note that this is a “non-exhaustive list”.

The WP29’s final guidance reduces the criteria for determining whether a DPIA is mandatory to nine considerations, removing international transfers as a factor. Controllers may welcome this change, given that many data processing activities involve international transfers.

The relevant criteria include:

  • Evaluation or scoring (including profiling and predicting)
  • Automated decision-making with legal or similarly significant effect
  • Systematic monitoring
  • Sensitive data or data of a highly personal nature
  • Data processed on a large scale
  • Matching or combining of data sets
  • Data concerning vulnerable data subjects
  • Innovative use or application of new technological or organisational solutions
  • Processing that prevents data subjects from exercising a right or using a service or a contract
