Protection of children’s online space: ICO issues code of practice on age-appropriate design

The UK Information Commissioner’s Office (ICO) has issued a consultation on a draft code of practice on age-appropriate design for online products and services provided by information society services (ISS). The consultation closes on 31 May 2019. The draft code sets out principles for any online service likely to be accessed by children under the age of 18.

Best interests of the child at the core

This code of practice is based on the key principle in the United Nations Convention on the Rights of the Child that the best interests of the child should be a primary consideration in all actions concerning children. Amid today’s myriad of online services, it has become increasingly difficult for both parents and children to make informed choices or exercise control over the way services use children’s personal data. The code aims to respect the rights and duties of parents while also recognising children’s evolving capacity to make their own choices.

16 headline ‘standards of age-appropriate design’

The code requires ISS providers to abide by 16 cumulative standards when processing children’s personal data through their services.

Danish DPA issues its first GDPR fine for late deletion of customer telephone numbers

Denmark’s Data Protection Authority, Datatilsynet (DPA), recently recommended its first fine for a breach of the GDPR, against the taxi company Taxa 4×35 (Taxa), for over-retention of certain customer data.

Breach of the data minimisation principle

The Danish DPA found that Taxa breached the GDPR’s data minimisation principle by retaining personal data long after the envisaged retention limit for that data had passed, in effect confirming an affirmative duty to delete expired personal data. Taxa had deleted customers’ names and addresses after two years of retention but had kept customers’ telephone numbers for a further three years. Taxa argued that telephone numbers were an essential part of its IT database and therefore could not be deleted within the same time span.


Council of Europe issues recommendation on processing health-related data

The Council of Europe (CoE) recently issued its recommendation to member states on the protection of health-related data (Recommendation). The Recommendation guides member states to ensure that their law and practice reflect the principles of processing health-related data.

The Recommendation stems from Convention 108, the first international treaty in the field of data protection. Like the General Data Protection Regulation 2016/679 (GDPR), Convention 108 sets out principles for processing health data, though in less detail than the GDPR. The Recommendation’s principles on health data align with the GDPR, and in some cases provide more guidance on processing health-related data.

Some of the key recommendations on processing certain health-related data are below.


Germany finally implements the Trade Secrets Directive!

With Germany finally implementing the Trade Secrets Directive into its national law, know-how theft cases are becoming more frequent. Whilst questions have been raised about adequate protection for whistleblowers and journalists, many see the new law as a positive move towards better know-how protection in Germany.

For more information, and to read Frankfurt IP Partner Anette Gärtner’s thoughts on this, please read Juve Patent’s Psst, top secret! article.

Processing publicly available personal data without telling data subjects? The Polish data protection authority has (bad) news for you…

The Polish Data Protection Authority (UODO) imposed its first fine for a violation of the General Data Protection Regulation 2016/679 (GDPR). Bisnode, a data aggregation company headquartered in Sweden, was fined just under PLN 1 million (around EUR 220,000). The decision found that Bisnode had failed in its duties to inform data subjects how it processes their personal data under GDPR article 14.

GDPR article 14

GDPR article 14 requires companies to explain to individuals how they process their personal data if the companies did not obtain it directly from the individuals. A typical example is companies gleaning information from open sources, such as social media profiles. GDPR article 14 requires controllers to explain to affected individuals what personal data is processed, how it is processed, how long it will be retained, and so on.

GDPR article 14(5) contains exemptions from this obligation. Controllers are released from their obligation to inform affected individuals where “the provision of such information proves impossible or would involve a disproportionate effort” or where the obligation “is likely to render impossible or seriously impair the achievement of the objectives of that processing”.


Bisnode obtained personal data from public databases and registers in order to provide verification services and reports. The personal data focused on current and past entrepreneurs and business owners.

The data set under scrutiny by UODO contained approximately 7.6 million records of personal data. Bisnode was able to provide the correct privacy information to roughly 700,000 individuals where records included email addresses. Bisnode only had mobile numbers and postal addresses for the remaining individuals in the data set. Bisnode displayed a notice on its website for those individuals who did not receive a privacy notice by email.

Bisnode reasoned that the cost of sending privacy information to these individuals by post and/or SMS would have been disproportionate. The postage cost alone was estimated at around PLN 33 million (around EUR 7.7 million), which Bisnode told UODO exceeded its turnover for the previous year. Further, allocating staff and resources to prepare, send and manage responses would have placed a significant strain on the company, which Bisnode claimed could threaten its continued operations in Poland.


Despite this, UODO found that Bisnode had failed to discharge its GDPR article 14 obligation to inform data subjects how their personal data was processed. In its decision, UODO stated that contacting affected individuals would not be impossible or involve a disproportionately large effort. UODO further found that Bisnode’s knowledge of its GDPR obligations as a controller and its continued processing were aggravating factors. Even though no damage to data subjects was established, UODO did not consider this a mitigating factor. Of those data subjects informed by Bisnode, around 12,000 objected to the use of their data. The decision states that the fine was set at a high level to deter companies from accounting for such fines as an operational cost.


Interestingly, UODO chose not to publish the identity of Bisnode in the decision. Bisnode later published an online statement revealing its involvement. This decision raises more questions than it answers and illustrates the need for clarity and consistency among EU regulators.

UODO’s decision sets a high bar for the use of GDPR article 14(5) exemptions. It is now unclear when these exemptions may reasonably be relied on. The tension between practices such as data scraping and the rights of data subjects is difficult to resolve. Currently, the UK’s data protection authority, the ICO, advises that if privacy information cannot be provided, a data protection impact assessment must be carried out. This does not seem to be in line with UODO’s approach.

The main takeaway: companies that process personal data gathered from public sources should tread carefully. Be mindful of your GDPR article 14 notification obligations. Document your processing decisions, particularly if you decide not to inform affected individuals how you process their personal data. And most importantly, be prepared for regulatory scrutiny and engagement.

New SEC guidance provides some clarity for digital asset issuers

On April 3, 2019, the U.S. Securities and Exchange Commission (SEC) took its first step towards providing greater clarity on the key question of how to evaluate whether transactions involving issuance or sales of digital tokens are sales of securities subject to U.S. securities laws and regulations.

The guidance was released in two parts:

  • First, the SEC’s Strategic Hub for Innovation and Financial Technology (FinHub) published the “Framework for ‘Investment Contract’ Analysis of Digital Assets” (Framework). In a public statement announcing its release, the SEC billed the Framework as an “analytical tool to help market participants assess whether the federal securities laws apply to the offer, sale, or resale of a particular digital asset.”
  • That same day, the SEC’s Division of Corporation Finance also issued a response to a no-action request submitted by TurnKey Jet, Inc. (TurnKey), a provider of air charter services. In the first SEC no-action letter addressing a blockchain-based project, the SEC indicated it would not pursue an enforcement action if TurnKey sold a digital token (TKJ) to its air charter customers, under the circumstances outlined in TurnKey’s letter.

Taken together, these publications provide much-needed guidance in an area of law and technology fraught with uncertainty.

To review the full article, click here.

Cooperation and consistency? Nine months in, the EDPB reflects on GDPR

The European Data Protection Board (EDPB) has published a report (Report) assessing the implementation and enforcement of the General Data Protection Regulation (EU) 2016/679 (GDPR). The Report focusses on how the cooperation and consistency mechanisms are being used by EU supervisory authorities (SAs).

Cooperation mechanism

Where cases involve cross-border processing, SAs cooperate through:

  • Mutual assistance;
  • Joint operations; and/or
  • The one-stop shop mechanism.

Communication is facilitated through the Internal Market Information system (IMI). IMI is an IT system that facilitates confidential and structured communications between SAs.

One-stop shop

At the initial stages of a cross-border case, a lead SA must be determined. Cross-border cases may involve:

  • A controller or processor who has an establishment in more than one Member State; or
  • Data processing affecting individuals in more than one Member State.

The lead SA will lead the cooperation procedure among SAs and draft the initial enforcement decision. This will then be reviewed by other relevant SAs. For the data controller or processor, the lead SA will be its point of contact in relation to investigation and enforcement.

If there is a dispute about which SA should lead, the EDPB can issue a binding decision. At the date of publication of the Report, the EDPB has yet to exercise its dispute resolution function.

Forty-five one-stop shop procedures have been initiated since GDPR came into force. The EDPB believes that the limited number may be due to draft decisions being subject to national administrative procedural laws. The EDPB has recently seen the rate of starting one-stop shop procedures increase.

Mutual assistance

Often useful in one-stop shop procedures, mutual assistance allows for the provision of information and ‘any other measures for effective cooperation’ between SAs. IMI sets a response deadline of one month where mutual assistance is formally requested. Most mutual assistance requests, whether formal or informal, have been answered within 23 days.

Joint operations

GDPR also allows SAs to carry out joint operations and enforcement measures, which can likewise be employed within the one-stop shop procedure. To date, however, no joint operations have been initiated.

Consistency mechanism

One of the key responsibilities of the EDPB is to ensure consistent application of GDPR across the EU. This has taken the form of publishing general guidelines and reports. However, the EDPB may at times be required to adopt consistency opinions and decisions to inform the decision of an SA at the national level. These opinions can be requested by the SA directly, or by the European Commission if the matter will affect more than one Member State. Since GDPR came into force, the EDPB has adopted 29 opinions, with three procedures ongoing.

GDPR at the national level

SAs have received 206,326 cases in the nine months since GDPR came into force. The majority of these have been complaints, with the second largest category being data breach notifications by controllers. SAs have also reported a general shortfall in budget and staffing. SAs expect an increase in the number of cases they will have to deal with this year.


The Report offers an interesting snapshot of how GDPR is bedding down across the EU. The limited time between the coming into force of GDPR and the Report’s publication means that data for some elements of assessment is limited. However, the EDPB has been clear that in certain areas, such as its role as a dispute resolution body, more intervention will be required. After an eventful initial nine months of GDPR, the next 12 promise to be just as interesting. Make sure to keep an eye on Technology Law Dispatch over the next year to keep fully up to date!

ICO investigates adtech awareness through fact finding forum

The Information Commissioner’s Office (ICO) recently published a summary report of its fact finding forum on data protection issues arising from advertising technology (adtech). Adtech is a term commonly used to refer to all technologies, software and services used for delivering and targeting online advertisements.

The ICO compiled responses from over 2,300 participants in an online survey, and conducted fieldwork with more than a hundred stakeholders (publishers, advertisers, start-ups, adtech firms, lawyers and citizens). The ICO highlighted three key challenges of adtech: (i) transparency, (ii) lawful basis and (iii) security.


ENISA tackles AI head on

The European Union Agency for Network and Information Security (ENISA) recently published its report on ‘Security and privacy considerations in autonomous agents’.

Artificial intelligence (AI) and complex algorithms offer unlimited opportunities for innovation and interaction, but they also bring a number of challenges that should be addressed by future policy frameworks at the EU level – especially in light of the amount of available data.

One of the objectives of the study was to provide relevant insights for both security and privacy for future EU policy-shaping initiatives. We have summarised some of the key security and privacy recommendations from the report below.
