Sense or censorship? UK government publishes White Paper on tackling online harms

The UK Government has published a White Paper outlining its approach towards regulating the internet to tackle online harms.

The White Paper cites a study carried out by the UK’s communications regulator (Ofcom) and its data protection regulator, the Information Commissioner’s Office (ICO). The study found that nearly one in four British adults suffered harm from online content or their interactions online. The UK Government concluded that existing regulatory and voluntary initiatives dealing with online harms do not go far enough and are inconsistently enforced.

Online harm

The White Paper broadly identifies what would be considered online harms, including activities and content involving:

  • child sexual exploitation and abuse (CSEA)
  • terrorism
  • harassment
  • disinformation
  • encouragement of self-harm and/or suicide
  • online abuse of public figures
  • interference with legal proceedings
  • cyber-bullying
  • children accessing inappropriate content


Sharing a Bounty of Personal Data? ICO issues £400,000 fine against UK pregnancy and parenting club for illegally sharing personal data

The Information Commissioner’s Office (ICO) announced its intent to fine Bounty (UK) Limited (Bounty) £400,000 for breaching the Data Protection Act 1998 (the Act). Due to the timing of this breach, it was governed by the Act rather than by the General Data Protection Regulation 2016/679 (GDPR). The maximum penalty permitted under the pre-GDPR regime in the United Kingdom was £500,000.

Bounty was a pregnancy and parenting support club. It provided information packs and goody bags to mothers in exchange for personal data. It also provided a mobile app for users to track their pregnancies, as well as offering a new-born portrait service. Its portrait service was the largest in-hospital service of its kind in the United Kingdom.

Bounty had a data protection policy on its website. The data protection policy stated that Bounty: (i) collected personal data for marketing purposes; and (ii) might share personal data with selected third parties. The data protection policy stated that users might receive communications from Bounty or a third party. However, the policy did not specifically identify third parties or the types of third parties that personal data would be shared with.

Bounty also collected personal data using hard copy cards completed in maternity wards. These cards stated that recipients consented to Bounty processing their personal data if the cards were filled in. The cards also briefly mentioned that Bounty could share personal data but, again, gave no detail about third party recipients. Recipients were required to provide their names and postal addresses when filling in the cards; to use Bounty’s services, they had no choice but to provide some personal data.

Warnings issued against two organisations for breaching Singapore data protection law

On 23 April 2019, Singapore’s Personal Data Protection Commission (commission) issued two separate grounds of decision against PAP Community Foundation and Tutor City.

In both cases, the commission issued warnings to the organisations for breaching the protection obligation under section 24 of the Personal Data Protection Act (PDPA), but no financial penalty was imposed.

PAP Community Foundation (PCF)

The facts of this case were as follows:

  • PCF provides kindergarten services, and organises various school trips.
  • In connection with a particular school trip, a teacher at PCF sent a photograph of a consolidated attendance list to a WhatsApp chat group comprising parents of students of the school. The attendance list contained the personal data of 15 students and their parents, including the contact and National Registration Identity Card (NRIC) numbers of five of the parents.
  • A parent alerted the teacher to this unauthorised disclosure, and the teacher quickly deleted the message from the group chat. The same parent lodged a complaint with the commission.

The commission’s findings were as follows:

  • It was evident that PCF did not have specific policies or procedures to guide its employees (including its teachers) on the use and disclosure of personal data in their communications with parents of students who were enrolled at the preschools.
  • Given the frequency of interaction between PCF’s staff and the parents, such policies and training should reasonably be expected to be put in place to guide the staff on how to comply with PCF’s data protection obligations.
  • While PCF had provided data protection training to its staff, training alone cannot substitute for data protection policies and procedures.
  • To its credit, however, PCF acted swiftly to address its inadequate policies, which carried mitigating value. In particular, the commission noted that PCF had taken the following remedial measures:
    • Immediate suspension of all WhatsApp chat groups following the disclosure;
    • Expedited implementation of rules pertaining to the use of social media and WhatsApp chat groups;
    • Roll-out of data protection policies including document retention and information security policies; and
    • Development of a practical employee handbook and conducting refresher training for its employees.


Protection of children’s online space: ICO issues code of practice on age-appropriate design

The UK Information Commissioner’s Office (ICO) has issued a consultation on a draft code of practice on age-appropriate design for online products and services provided by information society services (ISS). The consultation closes on 31 May 2019. The draft code sets out principles for any online service accessed by children under the age of 18.

Best interests of the child at the core

This code of practice is based on the key principle in the United Nations Convention on the Rights of the Child that the best interests of the child should be a primary consideration in all actions concerning children. Amid today’s myriad online services, it has become increasingly difficult for both parents and children to make informed choices about, or exercise control over, the way services use children’s personal data. The code aims to respect both the rights and duties of parents and children’s evolving capacity to make their own choices.

16 headline ‘standards of age-appropriate design’

The code requires ISS providers to abide by 16 cumulative standards when processing the personal data of children through their services.

Danish DPA issues its first GDPR fine for late deletion of customer telephone numbers

Denmark’s Data Protection Authority Datatilsynet (DPA) recently recommended its first fine for a breach of the GDPR by the taxi company, Taxa 4×35 (Taxa), due to its over-retention of certain customer data.

Breach of the data minimisation principle

The Danish DPA found that Taxa breached the GDPR’s data minimisation principle by retaining personal data long after the envisaged retention limit for such data, effectively confirming an affirmative duty to delete expired personal data. Taxa had deleted customers’ names and addresses after two years but had retained their telephone numbers for a further three years. Taxa argued that telephone numbers were an essential part of its IT database and therefore could not be deleted within the same time frame.


Council of Europe issues recommendation on processing health-related data

The Council of Europe (CoE) recently issued its recommendation to member states on the protection of health-related data (Recommendation). The Recommendation guides member states to ensure that their law and practice reflect the principles of processing health-related data.

The Recommendation stems from Convention 108, the first international treaty in the field of data protection. Like the General Data Protection Regulation 2016/679 (GDPR), Convention 108 sets out principles for processing health data, but in less detail than the GDPR. The Recommendation’s principles on health data align with the GDPR and, in some cases, provide more guidance on processing health-related data.

Some of the key recommendations on processing certain health-related data are below.


Germany finally implements the Trade Secrets Directive!

With Germany finally implementing the Trade Secrets Directive into its national law, know-how theft cases are becoming more frequent. Whilst questions have been raised about adequate protection for whistleblowers and journalists, many see the new law as a positive move towards better know-how protection in Germany.

For more information, and to read Frankfurt IP Partner Anette Gärtner’s thoughts on this, please read Juve Patent’s Psst, top secret! article.

Processing publicly available personal data without telling data subjects? The Polish data protection authority has (bad) news for you…

The Polish Data Protection Authority (UODO) imposed its first fine for a violation of the General Data Protection Regulation 2016/679 (GDPR). Bisnode, a data aggregation company headquartered in Sweden, was fined just under PLN 1 million (around EUR 220,000). The decision found that Bisnode had failed in its duties to inform data subjects how it processes their personal data under GDPR article 14.

GDPR article 14

GDPR article 14 requires companies to explain to individuals how they process their personal data if the companies did not obtain it directly from those individuals. A typical example is a company gleaning information from open sources, such as social media profiles. GDPR article 14 requires controllers to explain to affected individuals what personal data is processed, how it is processed, how long it will be retained, and so on.

GDPR article 14(5) contains exemptions from this obligation. Controllers are released from their obligation to inform affected individuals where “the provision of such information proves impossible or would involve a disproportionate effort” or where the obligation “is likely to render impossible or seriously impair the achievement of the objectives of that processing”.

Bisnode obtained personal data from public databases and registers in order to provide verification services and reports. The personal data related to current and former entrepreneurs and business owners.

The data set under scrutiny by UODO contained approximately 7.6 million records of personal data. Bisnode was able to provide the correct privacy information to roughly 700,000 individuals where records included email addresses. Bisnode only had mobile numbers and postal addresses for the remaining individuals in the data set. Bisnode displayed a notice on its website for those individuals who did not receive a privacy notice by email.

Bisnode reasoned that the cost of sending privacy information to these individuals by post and/or SMS would have been disproportionate. The postage cost alone was estimated at around PLN 33 million (around EUR 7.7 million), which Bisnode told UODO exceeded its turnover for the previous year. Bisnode also argued that allocating staff and resources to prepare, send and manage responses would place a significant strain on the company and could threaten its continued operations in Poland.

Despite this, UODO found that Bisnode had failed to discharge its GDPR article 14 obligation to inform data subjects how their personal data was processed. In its decision, UODO stated that contacting the affected individuals would not have been impossible, nor would it have involved a disproportionately large effort. UODO also treated Bisnode’s knowledge of its obligations as a controller and its continued processing as aggravating factors, and did not regard the absence of established damage to data subjects as mitigating. Of the data subjects Bisnode did inform, around 12,000 objected to the use of their data. The decision states that the fine was set at a high level to deter companies from treating such fines as an operational cost.

Interestingly, UODO chose not to publish the identity of Bisnode in the decision. Bisnode later published an online statement revealing its involvement. This decision raises more questions than it answers and illustrates the need for clarity and consistency among EU regulators.

UODO’s decision sets a high bar for relying on the GDPR article 14(5) exemptions, and it is now unclear when they may reasonably be relied on. The tension between practices such as data scraping and the rights of data subjects is difficult to resolve. Currently, the UK’s data protection authority, the ICO, advises that if privacy information cannot be provided, a data protection impact assessment must be carried out. This does not appear to be in line with UODO’s approach.

The main takeaway for companies that process personal data gathered from public sources is to tread carefully. Be mindful of your GDPR article 14 notification obligations. Document your processing decisions, particularly if you decide not to inform affected individuals how you process their personal data. And, most importantly, be prepared for regulatory scrutiny and engagement.