Check your compliance with the updated ICO guidance on cookies

On July 3, 2019 the Information Commissioner’s Office (ICO) published updated guidance on the use of cookies. Although the guidance confirms requirements with which most data practitioners already comply, it outlines steps for non-compliant companies. Now that the ICO has confirmed its regulatory expectations and signalled immediate enforcement, companies need to take action now to avoid censure.

To read more on the updated guidance, what consents are needed and what they should look like, click here.

State AGs continue to consider new ways to protect data privacy

As states’ “top cops,” one of the primary responsibilities of state attorneys general (AGs) is consumer protection, and more and more AGs are focusing on how to protect consumer data privacy. Discussions at the recent Conference of Western Attorneys General (“CWAG”) Annual Meeting in Santa Barbara reflect this focus and demonstrate that state enforcers are looking at new authority, and new ways to expand their existing authority, to ensure that their citizens’ data privacy is protected.

These discussions took place during two separate panels, one focused on data privacy in general (moderated by American Samoa AG Ale and including senior AG staff from Arizona and New Mexico) and the other on connected devices (the Internet of Things or “IoT,” moderated by New Mexico AG Balderas and including senior AG staff from Colorado). The panels were wide-ranging, touching on issues such as the use of jail-broken IoT devices for piracy, proposed state legislation to prevent the use of unauthorized device repairers, and the protection of children online. Each panel, however, came back to a central focus: that states are continuing to look for more weapons in their data privacy enforcement arsenals.

Most importantly, panelists discussed the expanded use of unfairness authority under the FTC Act (and by extension, state laws on unfair and deceptive acts and practices). In the past, these laws were used primarily as tools to ensure that privacy representations that companies made were true and not deceptive (as the FTC did recently with respect to companies representing compliance with safe harbor frameworks). Use of unfairness authority would greatly expand the manner in which these laws are enforced. Similarly, panelists also discussed greater use of antitrust laws to investigate and take action, if necessary, against big tech companies.

Additionally, panelists called for the creation of new laws, citing both California’s IoT law (which goes into effect next year) as well as the possibility of GDPR-like federal legislation. That said, panelists made it clear that they would not welcome new laws that entrench incumbents, and any new law should put the onus for data privacy on the data collector (similar to the data fiduciary-like standard that has been introduced in New York). Panelists stressed that the key question for policymakers, enforcers, consumers and the tech industry is whether consumers have a property right in their data, which likely will be answered in the next few years, one way or another.

These panels, and countless other examples, continue to confirm that AGs are not going away when it comes to data privacy. In the likely absence of meaningful federal legislation, and with both new state laws such as CCPA and expanded use of existing authority, AGs continue to lead the charge. If companies are not yet thinking about their AG strategy, they should do so now.

CMA launches market study into online platforms and digital advertising

The Competition and Markets Authority (CMA) has recently published an invitation to take part in its market study into online platforms and the digital advertising market in the UK.

This market study has not come out of the blue. The ICO recently published a report on adtech in which the ICO concluded that many actors involved in real time bidding were in breach of the General Data Protection Regulation 2016/679 (GDPR). Earlier this year Lord Tyrie, chair of the CMA, wrote of the need for reform of the UK competition regime, likening the current regime to “an analogue system…in a digital age”.  In particular, Lord Tyrie identified the need for regulators to combat “rapidly-emerging forms of consumer detriment” caused by digital markets, as well as address the “public doubt whether markets work for their benefit”.

The topics set out for the market study address exactly these points raised by Lord Tyrie.

The market study seeks further information on the following three key themes.

     i)     The market power of online platforms

The study seeks to better understand the role of data used by online platforms, and whether incumbents have raised barriers to entry for new market participants by monopolising data. In particular, the CMA is keen to investigate ‘walled gardens’ of data collected by online platforms that share only aggregated data with partners, which it has identified as a key concern for the study.

     ii)     Consumer agency

Do consumers have the knowledge, skills and desire to control the collection and processing of their data? This is a key question the CMA hopes the market study will shed some light on. It is hoped that this fact-gathering stage will eventually lead to measures recommended by the CMA which will facilitate greater consumer control of data.  This, in turn, may increase market competition and deal with some of the issues around entrenched market actors to be further explored under the first key theme.

     iii)     Competition among digital advertisers

The CMA would also like to better understand what effect market concentration is having on the digital advertising market. The CMA acknowledges the complexity of the market, but also identifies a lack of transparency in how it operates. In particular, a better understanding of money and data flows will be a priority for the market study.

The CMA has focused the scope of its study on online platforms that are funded by digital advertising. The CMA study will also differentiate itself from the UK Government’s recent report and white paper on online harms by focusing on the digital advertising market rather than whether instances of digital advertising involve user deception.


Adtech, digital advertising, and the operations of online platforms have piqued regulators’ interest recently. The CMA invites comments from all interested parties on the issues raised in the study scope until 30 July 2019. If you would like to take part, please review the statement of scope and email responses to the CMA here. The CMA aims to publish its final report in July next year. We expect further developments in this area in the meantime, so keep an eye on our blog for upcoming alerts.

Not quite everything everywhere – ICO fines EE £100,000 for unsolicited text messages

The Information Commissioner’s Office (ICO) announced a £100,000 fine imposed on the telecoms company, EE Limited (EE), for breaching the Privacy and Electronic Communications Regulations 2003 (PECR). The timing of the breach meant that the General Data Protection Regulation 2016/679 (GDPR) was not applicable.

What happened?

EE sent customers a text message encouraging them to use the ‘My EE’ app and to consider upgrading their mobile handsets. A second round of messages was also sent to customers. In early 2018, EE sent around 2.5 million text messages in total to customers who had not provided their consent.


EE’s position was that the text messages sent were service messages and not covered by the electronic marketing rules. The text messages, however, were found to contain promotional content encouraging customers to buy further products and services.

The ICO found in its investigation that these text messages amounted to marketing messages, engaging the provisions of PECR. Regulation 22 PECR provides that marketing messages may only be sent if prospective recipients have given their consent and have a simple way to opt out of marketing, both when their details were initially collected and in every marketing message sent. Individuals have the right to opt out of receiving marketing at any time, at which point it is the organisation’s responsibility to stop sending it.


EE has the chance to appeal the monetary penalty. In a similar context, another company subject to a fine by the ICO was able to overturn its fine on appeal at the First-Tier Tribunal (General Regulatory Chamber).

This is the latest fine in a line of enforcement actions taken by supervisory authorities here in the UK and elsewhere in Europe.

This case underlines the importance of reviewing and ensuring compliance of marketing processes and the recording of consents. We will keep you informed of further developments in this area.

German Parliament voted ‘Yes’ to Second GDPR Implementation Act

In a late night session on 28 June 2019, the German Parliament (Bundestag) passed the Second GDPR Implementation Act (2. Datenschutz-Anpassungs-und-Umsetzungsgesetz EU – 2. DSAnpUG-EU; the Act). The Act is available online in German here and here. For more information on the First German GDPR Implementation Act read our blog here.

The Act will amend 154 German laws; a list of all amended laws is included on page 9 of the Act. The changes mostly comprise editorial alignments with the GDPR and (sector-specific) legal bases for data processing. The Act also includes some changes to the German Federal Data Protection Act (Bundesdatenschutzgesetz – BDSG; FDPA), most notably increasing the number of employees that triggers the designation of a data protection officer (DPO) from 10 to 20.

Main changes to the FDPA

The main changes to the FDPA include:

  • New legal basis for processing of special categories of personal data: The Act introduces a new legal basis for the processing of special categories of personal data by public and private organisations – such processing can now be justified if it is absolutely necessary on the grounds of significant public interest (Section 22(1)(d) FDPA).
  • Electronic form of consent for employment-related processing: Where the processing of employee data is based on consent, such consent can be given electronically, in addition to in writing (Section 26(2) FDPA).
  • Increase in required number of employees for DPO appointment: The threshold number of employees constantly engaged in the automated processing of personal data that triggers the appointment of a DPO by a data controller or processor was increased from 10 to 20 (Section 38(1) FDPA).
  • New legal basis for processing for purposes of public awards and honours: The new Section 86 FDPA provides that personal data (including special categories of personal data) may be processed by public and private organisations without the knowledge of the data subject for the purpose of granting public honours and awards. The data subject rights in Articles 13-16, 19 and 21 GDPR shall not apply.


The main change under the Act is the amendment of the requirements for DPO appointment in Section 38(1) FDPA. This should simplify compliance for small and medium-sized enterprises as well as non-profit organisations. However, they must still comply with all other GDPR requirements if the GDPR applies to them. It might, therefore, still be worthwhile for small and medium-sized enterprises to (voluntarily) appoint a DPO.

The new legal basis for processing personal data under Section 22(1)(d) FDPA is very vague and will require interpretation from supervisory authorities and the German courts.

The German legislator has also missed the chance to end the debate around some rather controversial topics that have arisen since the GDPR entered into force. In particular, German courts are currently split on the question of whether competitors are entitled to injunctive relief (cease and desist orders) for GDPR violations under the German Act against Unfair Competition. The German legislator has failed to provide clarification on this matter in the Act.

Further, the German legislator has not included a provision in the Act to reconcile the right to privacy with the right to freedom of expression and information despite being required to do so under Article 85 GDPR.

The Act is still subject to approval by the German Federal Council (Bundesrat).

You can find out more about the implementation laws of all EU member states in our factsheet here.

GDPR on its first birthday – people know what it is but aren’t sure what it does

Never one to miss a bandwagon, the European Commission has published three documents to mark the first year of GDPR:

  • a Eurobarometer survey on data protection (Eurobarometer Survey);
  • a report from its multi-stakeholder expert group (MEG Report); and
  • guidance on the free flow of non-personal data within the EU (reported on here).

We set out some of the key findings below.

The Eurobarometer Survey

Following the lead of the ICO, which recently published the results of its own survey into online harms, the Eurobarometer Survey looks at data protection issues in EU member states.

The Eurobarometer Survey was compiled from over 27,000 survey responses. Around two-thirds of respondents had heard of the GDPR. However, only one-third had both heard of the GDPR and, crucially, knew what it actually was. Respondents in Sweden (63 per cent), the Netherlands (60 per cent), Poland (56 per cent), Denmark (51 per cent), Ireland and the Czech Republic (both 50 per cent) were the most likely to have heard of GDPR and know what it is.

Around three in five respondents knew about their local data protection authority. Respondents in the Netherlands (82 per cent), Latvia (76 per cent), Finland and Sweden (both 74 per cent) were most aware of their local data protection authorities.

Surprisingly, only around three-quarters of respondents use the internet daily. Those that do use the internet are recurrent users of social media and online shopping sites, with around four in five using the internet for those purposes. Just over half of internet users use a social network every day.

Another surprising finding is that around three in five respondents say they read privacy notices. However, just over one in ten internet users claim they read privacy notices in full, while just under half admit to reading them only in part. The most common reason given for not reading privacy notices in full was their length.

The MEG Report

In contrast to the Eurobarometer Survey, the MEG Report draws on the discussions of interested organisations, experts and other stakeholders. Its findings drill down further into some of the issues around user sentiment raised in the Eurobarometer Survey.

The MEG Report focuses more on the experience of SMEs and the implementation of GDPR. In particular, SMEs have raised concerns about the lack of exceptions available to them under GDPR. SMEs report that legacy information technology systems have made their GDPR compliance difficult and costly to achieve. Similarly, certification mechanisms are not financially attractive to SMEs due to their high cost.

Respondents were also concerned about additional compliance requirements envisaged under the ePrivacy Regulation. Respondents in the telecoms and online services sectors raised the issue of having to repeat compliance steps already taken for GDPR. However, respondents saw the ePrivacy Regulation, together with GDPR, as “important building blocks for restoring confidence of consumers in the digital economy”.

Respondents were generally satisfied with their local data protection authorities, with many highlighting guidance provided as being a great help during the GDPR implementation period. However, companies carrying out business in more than one country cited the lack of consistent GDPR application across EU member states as a problem. Respondents stressed the importance of avoiding a fragmented approach to GDPR through local laws.


GDPR’s first anniversary has been a popular time for aggregated analysis across a range of areas, including enforcement trends, AI, and user experience online. The Eurobarometer Survey and MEG Report illustrate that GDPR has bedded down relatively well and established itself as a law that most Europeans are aware of. Whether this awareness can be sustained over the next year is worth tracking, as many companies are struggling with privacy fatigue among staff, users, and customers.

For more GDPR anniversary reading (we are not averse to bandwagon-jumping either!), please have a look at our series of thought-pieces. These include what to consider for your GDPR year two to-do list and how some specific industries and sectors have been affected by GDPR’s first year.

Not just any data, this is smart data – UK government consultation

Earlier this month, the UK government launched its Smart Data Consultation (Consultation). The Consultation follows the publication of the terms of reference which launched the smart data review late last year, and seeks input on proposals to:

  • enable data-driven innovation in consumer markets;
  • use data and technology to help vulnerable consumers; and
  • ensure consumers and their data are protected.

What is smart data?

Smart data is an extension of the right to data portability which individuals have under the General Data Protection Regulation. The crux of what makes consumer data ‘smart data’ is that it can be easily and securely transmitted from one third party service provider to another, so that the data can be used to provide innovative services to consumers. The Consultation envisages that smart data will:

  • be immediately transferred on the request of the consumer;
  • use Application Programming Interfaces (APIs) to share data securely;
  • where required, be formatted as a continuous flow of data between service providers rather than a one-off transmission;
  • adhere to certain common technical standards, data formats and definitions to minimise access barriers and ensure interoperability; and
  • if necessary, be accompanied by certain product and performance data to facilitate further innovation.
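The interoperability idea behind these bullet points can be illustrated with a minimal Python sketch. Everything here is an illustrative assumption rather than anything specified in the Consultation: the record fields, the JSON interchange format and the cost calculation are hypothetical, standing in for whatever common technical standards and data formats the initiatives would actually adopt.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical common data format agreed by all providers (an assumption,
# not a published standard): field names and units are fixed in advance so
# any third party provider can parse a record without bespoke integration.
@dataclass
class EnergyTariffRecord:
    consumer_id: str
    tariff_name: str
    unit_rate_pence_per_kwh: float
    annual_usage_kwh: float

def export_record(record: EnergyTariffRecord) -> str:
    """Serialise a record into the agreed interchange format (JSON here)."""
    return json.dumps(asdict(record), sort_keys=True)

def import_record(payload: str) -> EnergyTariffRecord:
    """Parse a record received over an API from another provider."""
    return EnergyTariffRecord(**json.loads(payload))

def estimated_annual_cost(record: EnergyTariffRecord) -> float:
    """A comparison service could price a consumer's usage from the
    transferred data alone (result in pounds)."""
    return record.unit_rate_pence_per_kwh * record.annual_usage_kwh / 100

# One provider exports; a comparison tool imports and prices the tariff.
record = EnergyTariffRecord("c-123", "Standard Variable", 18.5, 2900.0)
received = import_record(export_record(record))
print(round(estimated_annual_cost(received), 2))  # 536.5
```

Because the round-trip preserves the record exactly, a switching or comparison service can consume data from any provider that honours the shared format — which is the access-barrier and interoperability point the Consultation makes.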

Data driven innovation in consumer markets

The Consultation envisages that a key use of smart data will be to make it easier for consumers to manage their household bills, facilitated by allowing smart data flows between third party providers so that consumers can switch services more easily. An objective of the Consultation is that the use of smart data will allow for bill-splitting services, automatic switching services, and advanced comparison tools, helping consumers find the best services for them based on their historic data and preferences. This interoperability overlaps with other consumer protection and competition principles that allow consumers to switch easily between services, even though data protection and the right of portability have been treated as an independent right under EU law.

The Consultation is focused on consumer smart data within regulated markets including financial services, energy and communications. The Government anticipates that it may introduce legislation extending smart data initiatives to other markets, if Parliamentary time permits.

Smart data and vulnerable consumers

Smart data is derived in part from the open banking initiative, which the Consultation identifies as offering many benefits to vulnerable consumers that could be adopted outside of banking. In particular, the Consultation sees smart data as enabling services that are more accessible, simpler to use and less effortful, presenting vulnerable consumers with less complex decisions. The Consultation cites a trial in the energy sector in which the regulator, Ofgem, gathered data on tariffs paid by consumers. Consumers who had paid higher-rate tariffs without switching for a period were invited by Ofgem to switch, so as to avoid the ‘loyalty penalty’ of missing out on more cost-effective tariffs offered to new customers. Consumers who were invited to switch tariffs were nearly 10 times more likely to switch than a control group who received no such invitation.

Protecting consumers and their data

The Consultation has set out the following proposed rules for smart data:

  • third party providers should only access smart data once the explicit consent of the consumer has been gained and verified through a secure authentication process;
  • access to smart data should be limited and revoking access should be as easy as giving access;
  • only accredited third party providers should be able to access high risk data (there is no information in the consultation about such an accreditation process);
  • third party providers should only be able to pass smart data to other third party providers (i) for approved purposes and (ii) once the consumer has provided their consent; and
  • in the event of data loss or misuse there should be clear liability rules and swift redress mechanisms in place for consumers.
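The proposed access rules above can be sketched as a small consent registry. This is a hypothetical illustration of the logic the rules describe, not anything the Consultation specifies: the class name, the accreditation set and the high-risk flag are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    """Hypothetical sketch of the proposed rules: access is consent-gated,
    revocable, and limited to accredited providers for high-risk data."""
    grants: set = field(default_factory=set)      # (consumer, provider) pairs
    accredited: set = field(default_factory=set)  # providers cleared for high-risk data

    def give_consent(self, consumer: str, provider: str) -> None:
        self.grants.add((consumer, provider))

    def revoke_consent(self, consumer: str, provider: str) -> None:
        # Revoking access is as easy as giving it: a single symmetric call.
        self.grants.discard((consumer, provider))

    def may_access(self, consumer: str, provider: str, high_risk: bool = False) -> bool:
        # No consent, no access; high-risk data additionally requires accreditation.
        if (consumer, provider) not in self.grants:
            return False
        return provider in self.accredited if high_risk else True

reg = ConsentRegistry(accredited={"acme-finance"})
reg.give_consent("c-123", "acme-finance")
print(reg.may_access("c-123", "acme-finance", high_risk=True))  # True
reg.revoke_consent("c-123", "acme-finance")
print(reg.may_access("c-123", "acme-finance"))                  # False
```

The sketch captures the first four proposed rules; the liability and redress mechanisms in the final bullet would sit outside any such access check.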


The Consultation seeks to facilitate the transmission of data from one service provider to another, while also addressing the competition and consumer protection issues around interoperability between service providers and systems. The Consultation is open until 6 August 2019. If you would like to be involved, click here. As always, keep an eye on the TLD blog for further updates in this area.

Your next steps following the ICO update on real-time bidding and adtech

On June 20, 2019 the Information Commissioner’s Office (ICO) published an Update Report on real-time bidding (RTB). Following the recent one-year anniversary of GDPR implementation, the ICO has made adtech a focus for the upcoming year. Although RTB has not been declared obsolete, the report characterises current RTB practices as non-compliant with the GDPR and warns against continued involvement in such practices. While the ICO has yet to impose fines or take strict enforcement action, it says it will release another report in six months, and companies should take immediate action to address non-compliance.

To read more on the update report and actions you need to take, click here.

FTC and state law enforcement officials step up efforts against illegal telemarketing

The Federal Trade Commission (FTC) announced a joint state-and-federal initiative, “Operation Call It Quits,” which targets illegal telemarketing practices that violate the FTC’s Telemarketing Sales Rule (TSR).

The TSR, which applies to interstate telephonic marketing communications intended to “induce the purchase of goods or services or a charitable contribution,” makes it illegal to engage in “abusive” acts and practices like failing to transmit caller identification information, calling telephone numbers listed on the National Do Not Call Registry, and using certain types of prerecorded messages or “robocalls.” The TSR also makes it illegal to engage in “deceptive” acts and practices while on a telemarketing call, like processing billing information without authorization, failing to fully disclose certain information before a customer consents to pay for goods or services, and misrepresenting material details of a sale. As part of this latest sweep of TSR enforcement, the FTC announced four newly filed actions:

  • In the first action, the FTC filed suit in the U.S. District Court for the Middle District of Florida against corporate and individual defendants alleged to have made illegal robocalls to “financially distressed consumers” with offers of “bogus credit card interest rate reduction services.”
  • In the second action, the FTC filed suit in the U.S. District Court for the Central District of California against individual and corporate defendants accused of using illegal robocalls to sell “fraudulent money-making opportunities.”
  • The third action, filed on the FTC’s behalf by the U.S. Department of Justice (DOJ) in the Middle District of Florida, targeted the “informational technology (IT) guy” alleged to have developed and operated computer-based “autodialer” technology used to make millions of illegal robocalls.
  • The fourth action, filed by the DOJ on the FTC’s behalf in the U.S. District Court for the Central District of California, alleges that a business and its individual owners sought to develop marketing leads for home solar energy companies by making millions of illegal robocalls and engaging in other abusive practices, including making more than 1,000 calls to a single telephone number in one year.


The ICO’s take on explaining AI

The Information Commissioner’s Office (ICO) and the Alan Turing Institute have recently released an interim report (Report) outlining their approach to best practices in explaining artificial intelligence (AI) to users. The Report is of particular relevance to operators of AI systems who may be considering their duties under the General Data Protection Regulation 2016/679 (GDPR). In particular, operators of AI systems should be aware that articles 22 and 35 GDPR may be engaged by AI systems which involve automated decision-making.

The research commissioned for the Report established three key themes, which are outlined in the Report:

  • the importance of context in explaining AI-related decisions;
  • the need for improved education and awareness around the use of AI for decision-making; and
  • the challenges in deploying explainable AI, such as cost and the pace of innovation.

Explanation and AI utility

In this area, evidence was gathered from prospective users, who were presented with a scenario designed to establish whether they preferred a more accurate AI system with limited explanation or a less accurate system that was more easily explained. The results indicated that, in a health care scenario, users preferred a more accurate AI system even if it was less well explained to them. However, where AI systems are deployed in recruitment or criminal justice, users indicated a stronger preference for an explanation of those specific systems.

As well as prospective users, the research involved a number of key individuals from industry, regulators and academia (Stakeholders). The Stakeholders, in part, agreed with the findings from the users, although they emphasised the need for explanations to promote user trust, help eliminate AI system bias, and improve on current human practices, which are subject to their own biases.


The Report also found that education around AI systems is a key area for building confidence in the decisions of AI systems. Education could be in the form of school lessons, TV and radio programming, or public awareness campaigns. The Report also identified that important topics to cover included how AI systems work, their benefits, and clearing up any misconceptions around AI.

The Stakeholders were more reserved in their approach although they acknowledged the need to clear up the misconceptions around AI. One concern raised included the risk that more information may confuse prospective users further.

Key challenges

The key challenge identified by the Report is the cost of complying with any transparency and explanation requirements imposed on operators of AI systems. Concerns were also raised about the potential for information overload if information about AI systems is provided in the same way as terms of use policies are today. The Report highlights the need, as much as is possible, for AI systems operators to translate complex decision-making processes into an appropriate form for a lay audience.


AI has been a hot topic for regulators recently. Last year, the ICO identified AI as a key priority in its Technology Strategy. AI has also recently been the subject of regulatory focus from the Council of Europe, Centre for Data Ethics and Innovation, European Union Agency for Network and Information Security and European Commission. The Report offers an interesting perspective on some of the concerns regulators will be addressing in the AI space. We expect further updates from the ICO and other regulators in this area and we will continue to keep you up to date on any relevant developments. If you would like to help the ICO in the development of its AI regulatory framework, you can contact the relevant ICO team here.