As the European data protection framework evolves, big data remains a hot topic. These large data sets often consist of personal data, so big data has clear data protection implications.

The Information Commissioner’s Office (“ICO”) has therefore issued guidance on “Big data, artificial intelligence, machine learning and data protection.” This recent guidance provides helpful emphasis on accountability, transparency and how to evidence compliance with the General Data Protection Regulation (“GDPR”), which comes into effect on 25 May 2018. The ICO’s guidance explains the ways in which organisations can evidence accountability (for example, through documentation, algorithms and ethics).

Big data drives the need for new software developments, advances in data mining techniques, and processing fast enough to manage big data’s velocity, variety and volume. Because big data analysis can be carried out by machine learning or artificial intelligence (“AI”), keeping pace with these new developments requires careful consideration of how to manage personal data properly.

The ICO’s guidance does not steer people away from engaging with big data. Quite the opposite: it encourages its use. The ICO highlights that data protection is not only about the safety and rights of data subjects and the obligations of data controllers, but also allows access to higher-quality data; the benefits include fostering creativity and innovation, and potentially more effective use of AI.

The GDPR requires a Data Protection Impact Assessment (“DPIA”) to be carried out for the majority of big data applications that process personal data. A DPIA is an important tool for identifying and mitigating data protection risks. The process can be difficult when dealing with big data analytics, given the complexity and the unexpected uses of personal data involved. The ICO stresses that data analytics professionals should be aware of the data protection implications of DPIAs, and that this knowledge should be built into the formal qualifications required for those roles. This will also help embed data protection compliance into the ethics and culture of organisations.

The ICO provides six key recommendations which it believes will help organisations achieve compliance around big data:

  1. Using appropriate techniques to anonymise personal data;
  2. Providing meaningful privacy notices so that organisations are transparent about the processing of personal data;
  3. Embedding a DPIA framework into big data processing activities;
  4. Adopting a privacy-by-design approach in the development and application of big data analytics;
  5. Developing ethical principles to help reinforce data protection principles; and
  6. Implementing innovative techniques to develop auditable machine learning algorithms.
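To make the first recommendation concrete, the sketch below shows one common technique: replacing a direct identifier with a keyed hash before analysis. This is only an illustrative example, not a method prescribed by the ICO; the key and record fields are hypothetical, and it is worth noting that data treated this way is pseudonymised rather than anonymised, so it remains personal data under the GDPR.

```python
import hashlib
import hmac

# Hypothetical key for illustration; in practice it would be stored securely
# and separately from the data (per GDPR Article 4(5) on pseudonymisation).
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record with a direct identifier and an analytics field.
record = {"email": "jane.doe@example.com", "purchase_total": 42.50}

# The analytics copy keeps the useful field but swaps the identifier
# for a deterministic token, so records can still be joined or counted.
safe_record = {**record, "email": pseudonymise(record["email"])}
```

Because the hash is deterministic, the same individual maps to the same token across data sets, preserving analytical utility while removing the identifier from view. True anonymisation, by contrast, requires that individuals can no longer be identified by any means reasonably likely to be used.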

If data protection risks are considered at the outset and are properly addressed, this will be beneficial to businesses and consumers. This is emphasised in the ICO’s guidance:

“By recognising these benefits here, we do not intend to set up this paper as a contest between the benefits of big data and the rights given by data protection. To look at big data and data protection through this lens can only be reductive.”

So, this guidance should be interpreted as a pause for thought and an opportunity, rather than as a reason to shy away from big data analytics.