In March 2019, the Information Commissioner’s Office (ICO) released a Call for Input on developing the ICO’s framework for artificial intelligence (AI). The ICO simultaneously launched its AI Auditing Framework blog to provide updates on the development of the framework and encourage organisations to engage on this topic with the ICO.

On 23 October 2019, the ICO released the latest in its series of blogs (here). The blog outlines the key elements that organisations should focus on when carrying out a Data Protection Impact Assessment (DPIA) for AI systems that process personal data.

We have outlined below some of the key takeaways.

Continue Reading AI Auditing Framework: data protection impact assessment

On 15 October 2019, the Information Commissioner’s Office (ICO) released the latest in its series of blogs on developing its framework for auditing artificial intelligence (AI). The blog (here) focuses on AI systems and how data subjects can exercise their rights of access, rectification and erasure in relation to such systems. Below, we summarise some of the key takeaways and our thoughts on the subject.

Rights relating to training data

Organisations need data in order to train machine learning models. While it may be difficult to identify the individual to whom training data relates, it may still be personal data for the purposes of the General Data Protection Regulation (GDPR), and so will still need to be considered when responding to data subject rights requests under the GDPR. Provided no exception applies and reasonable steps have been taken to verify the identity of the data subject, organisations are obliged to respond to data subject access requests in relation to training data. The right of rectification may also apply but, as an inaccuracy in a single record within a large training data set is less likely to have a direct effect on the individual concerned, organisations should prioritise rectifying personal data that may have a direct effect on the individual.

Complying with requests from data subjects to erase training data may prove more challenging. If an organisation no longer needs the personal data because the machine learning model has already been trained, the ICO advises that the organisation must fulfil the request to erase. However, organisations may need to retain training data where the machine learning model has not yet been trained. The ICO advises that organisations should consider such requests on a case-by-case basis, but does not provide clarity on the factors organisations should consider.

Continue Reading ICO blogs on AI and data subject rights

In July 2019, the UK’s Financial Conduct Authority (FCA) held a week-long Global Anti-Money Laundering and Financial Crime TechSprint (FCA TechSprint) event. The FCA TechSprint looked at ways to effectively combat financial crime and money laundering within the financial services industry. On 16 October 2019, the Information Commissioner’s Office (ICO) released a blog (here) that focuses on the lessons learnt from the FCA TechSprint.

Background

The FCA TechSprint brought together teams from all over the world to explore how encryption techniques known as privacy enhancing technologies (PETs) can facilitate data and knowledge sharing among financial institutions, regulators and law enforcement agencies to detect and prevent money laundering and financial crime, while remaining compliant with data protection and privacy laws.

The teams worked towards developing solutions to the following use cases:

  • how can a network of market participants use PETs and data analytics to interrogate financial transactions stored in databases within institutions to identify credible suspicions without breaching data privacy legislation?
  • how can market participants efficiently and effectively codify typologies of crime which can be shared and readily implemented by others in their crime controls?
  • how can a market participant check that the company or individual they are performing due diligence on has not raised flags or concerns with another market participant, and/or verify that the data elements they have for the company or individual match those held by another market participant?
  • how can technology be used to assist in identifying an ultimate beneficial owner across a network of market participants and a national register?
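To make the third use case more concrete: one of the simplest PET building blocks is keyed pseudonymisation, where participants compare keyed hashes of a normalised identifier rather than the identifier itself. The sketch below is purely illustrative and is not any TechSprint team's actual solution; the shared key, institution names and normalisation rules are assumptions for the example, and real deployments would use more sophisticated techniques such as private set intersection.

```python
import hashlib
import hmac


def pseudonymise(value: str, shared_key: bytes) -> str:
    """Return an HMAC-SHA256 token for a normalised identifier.

    Participants holding the same shared key derive the same token
    for the same input, so they can test for a match without ever
    exchanging the raw personal data.
    """
    normalised = value.strip().lower()
    return hmac.new(shared_key, normalised.encode(), hashlib.sha256).hexdigest()


# Hypothetical example: institution A publishes tokens for flagged
# entities; institution B checks a customer against them.
key = b"key agreed out of band between the participants"
bank_a_flags = {pseudonymise(name, key) for name in ["Acme Ltd", "Jane Doe"]}

bank_b_query = pseudonymise("  ACME LTD ", key)  # normalisation removes case/whitespace
print(bank_b_query in bank_a_flags)  # True: a match, with no raw names shared
```

Note that this naive scheme still leaks membership information to anyone holding the key, which is precisely why the TechSprint explored stronger cryptographic approaches.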

The ICO’s Regulators’ Business Innovation Privacy Hub was present at the FCA TechSprint to offer guidance on the data protection implications of implementing PETs.

Continue Reading At odds no more: can regulatory collaboration bring innovation and data privacy closer together?

On 12 September 2019, the Committee of Ministers of the Council of Europe announced that an Ad hoc Committee on Artificial Intelligence (CAHAI) will be set up to consider the feasibility of a legal framework for the development, design and application of artificial intelligence (AI). On the same day, the United Kingdom’s data protection supervisory authority, the Information Commissioner’s Office (ICO), released the latest in its series of blogs on developing its framework for auditing AI. The blog (here) focuses on privacy attacks on AI models. With interest in the development of an AI legal framework increasing, what does the ICO consider to be the data security risks associated with AI?

Continue Reading Artificial intelligence: ICO considers security risks and the need for a new legal framework

Earlier this year, the Information Commissioner’s Office (ICO) issued a consultation on a draft code of practice on age-appropriate design for online services likely to be accessed by children (Code). The consultation closed on 31 May 2019, but the ICO has recently released an update on its progress in producing the Code.

The finalised Code will be informed

The UK’s new prime minister, Boris Johnson, has vowed that the UK will leave the EU on October 31, 2019. A no-deal (or “hard”) Brexit poses many privacy and data protection challenges for companies that operate in the UK. Post-Brexit privacy and data protection issues that you need to consider include:

  • how to maintain uninterrupted

Avid readers of this blog (and we trust there are many of you!) will recall that the UK government recently published a white paper. The white paper sets out the UK government’s approach to regulating the internet to tackle online harms. The Information Commissioner’s Office (ICO) has just published the Information Commissioner’s (Commissioner) full

On July 3, 2019, the Information Commissioner’s Office (ICO) published updated guidance on the use of cookies. Although the guidance confirms requirements with which most data practitioners already comply, it outlines steps for non-compliant companies. Now that the ICO has confirmed its regulatory expectations and signalled immediate enforcement, companies need to take action

The Information Commissioner’s Office (ICO) announced a £100,000 fine imposed on the telecoms company EE Limited (EE) for breaching the Privacy and Electronic Communications Regulations 2003 (PECR). The timing of the breach meant that the General Data Protection Regulation 2016/679 (GDPR) was not applicable.

What happened?

EE sent customers a text message encouraging them to