On 18 July 2022, the United Kingdom (UK) government set out its new proposals for regulating the use of artificial intelligence (AI) technologies while promoting innovation, boosting public trust, and protecting data. The proposals reflect a less centralised and more risk-based approach than in the EU’s draft AI Act.

The proposals coincide with the introduction to Parliament of the Data Protection and Digital Information Bill, which includes measures to use AI responsibly while reducing compliance burdens on businesses to boost the economy.

Background

On 1 August 2022, the Court of Justice of the European Union (“CJEU”) issued a decision (“Decision”) clarifying how the indirect disclosure of sexual orientation data is protected as special category data under Article 9 of the EU General Data Protection Regulation (“GDPR”). “Special category data” is defined in Article 9(1) of the GDPR and includes (for example) a data subject’s racial or ethnic origin or data concerning a natural person’s sex life or sexual orientation. The processing of such sensitive personal data is expressly prohibited unless one of the exemptions set out in Article 9(2) GDPR applies.

On 14 July 2022, the UK Information Commissioner’s Office (“ICO”) launched a public consultation on its draft three-year strategic plan, titled “ICO25”. The plan sets out a commitment to safeguard the information rights of the most vulnerable individuals, with the aim of empowering people to share their information confidently when using the products and services of today’s market, with work particularly targeting:

  • children’s privacy;
  • AI-driven discrimination;
  • the use of algorithms within the benefits system; and
  • the impact of predatory marketing calls.


The Information Commissioner’s Office (ICO) announced its intent to fine Bounty (UK) Limited (Bounty) £400,000 for breaching the Data Protection Act 1998 (the Act). Due to the timing of this breach, it was governed by the Act rather than by the General Data Protection Regulation 2016/679 (GDPR). The maximum penalty permitted under the pre-GDPR regime in the United Kingdom was £500,000.

Background

Bounty was a pregnancy and parenting support club. It provided information packs and goody bags to mothers in exchange for personal data. It also provided a mobile app for users to track their pregnancies, as well as offering a new-born portrait service. Its portrait service was the largest in-hospital service of its kind in the United Kingdom.

Bounty had a data protection policy on its website. The policy stated that Bounty: (i) collected personal data for marketing purposes; and (ii) might share personal data with selected third parties. It also stated that users might receive communications from Bounty or from a third party. However, the policy did not specifically identify the third parties, or the types of third parties, with which personal data would be shared.

Bounty also collected personal data using hard copy cards completed in maternity wards. The cards stated that, by filling them in, recipients consented to Bounty processing their personal data. The cards also briefly outlined the possibility that Bounty could share personal data but, again, included no detail about third-party recipients. Recipients were required to provide their names and postal addresses when filling in the cards; to access Bounty’s services, they had no choice but to provide some personal data.

The European Union Agency for Network and Information Security (ENISA) recently published its report on ‘Security and privacy considerations in autonomous agents’.

Artificial intelligence (AI) and complex algorithms offer unlimited opportunities for innovation and interaction, but they also bring a number of challenges that should be addressed by future policy frameworks at the EU level – especially in light of the amount of available data.

One of the objectives of the study was to provide insights relevant to both security and privacy for future EU policy-shaping initiatives. We have summarised some of the key security and privacy recommendations from the report below.

A meeting of data protection authorities from around the world has highlighted the development of artificial intelligence and machine learning technologies (AI) as a global phenomenon with the potential to affect all of humanity. A coordinated international effort was called for to develop common governance principles on the development and use of AI in accordance with ethics, human values and respect for human dignity.

The 40th International Conference of Data Protection and Privacy Commissioners (conference) released a declaration on ethics and data protection in artificial intelligence (declaration). While recognising that AI systems may bring significant benefits for users and society, the conference noted that AI systems often rely on the processing of large quantities of personal data for their development. In addition, it noted that some data sets used to train AI systems have been found to contain inherent biases, resulting in decisions which unfairly discriminate against certain individuals or groups.

To counter this, the declaration endorses six guiding principles as its core values to preserve human rights in the development of AI.

On 10 July 2018, the Information Commissioner’s Office (ICO) announced its intent to fine Facebook £500,000 for two breaches of the Data Protection Act 1998, the maximum penalty permitted under the pre-GDPR regime. If the penalty is enforced, it will be the biggest issued by the ICO in its history. For some perspective, had the breach occurred following the implementation of the General Data Protection Regulation 2016/679 (GDPR), the social network could have faced a fine of up to £359 million. Facebook now has a chance to respond to the ICO’s Notice of Intent, after which a final decision will be made.

Less than 30 days after issuing a Notice of Intent to fine Facebook, the ICO issued a further penalty as a result of the investigation, this time directed at Lifecycle Marketing (Mother and Baby) Ltd, also known as Emma’s Diary, a data broking company which provides advice on pregnancy and childcare. The ICO issued a £140,000 fine against Emma’s Diary for illegally collecting and selling personal information belonging to more than one million people.

Background

Facebook, alongside Cambridge Analytica, has been the focus of an ICO investigation for over a year. The investigation centred on the use of data analytics in political campaigns and was spearheaded by the Information Commissioner, Elizabeth Denham. It was formally commenced in May 2017, following the unearthing of evidence that personal data from over 87 million Facebook accounts had been illegally harvested. The ICO described it as one of the largest investigations ever undertaken by a data protection authority, a scale reflected in the most recent estimate of its cost, which has been put at almost three times the fine issued to Facebook. In addition to the fine, the ICO announced its intent to bring a criminal prosecution against SCL Elections Ltd, the parent company of Cambridge Analytica, for failing to respond adequately to an enforcement notice issued in May 2018.

As the European data protection framework evolves, big data remains a hot topic. The large data sets involved often consist of personal data, so big data has clear data protection implications.

The Information Commissioner’s Office (“ICO”) has therefore issued guidance on “Big data, artificial intelligence, machine learning and data protection.” This recent guidance provides helpful emphasis on accountability, transparency and how to evidence compliance with the General Data Protection Regulation (“GDPR”), which comes into effect on 25 May 2018. The ICO’s guidance explains the ways in which organisations can evidence accountability (for example, through documentation, algorithms and ethics).

The upcoming ninth amendment of the German Act against Restraints of Competition (Gesetz gegen Wettbewerbsbeschränkungen, ARC), which has already been approved by the German Federal Parliament (Bundestag) and the German Federal Council (Bundesrat), is expected to enter into force shortly. The new law is tailored to adapt German competition law to the specific features of the digital economy.

The EU Commission recently launched a Public consultation on Building the European data economy. The consultation is intended to feed into the Commission’s future policy agenda on the European data economy in 2017.

The data economy

In its Communication entitled “Building a European Data Economy,” the Commission has reiterated (from its 2012 Communication) the need to upgrade the EU’s legal framework regarding the trade of data.