In its recent decision of 11 June 2019 (docket no.: 4 U 760/19, available here), the Dresden Court of Appeals (Oberlandesgericht Dresden) had to decide on claims for damages under Article 82 GDPR arising from minor violations of the GDPR.
The defendant, the provider of a social network, had deleted a post from the plaintiff and suspended the plaintiff’s user account for three days. The plaintiff asserted, inter alia, material and non-material claims for damages under Article 82 GDPR.
The Court of Appeals’ decision
The Court of Appeals dismissed the asserted claims under Article 82 GDPR.
Article 82 (1) GDPR provides that any person who has suffered material or non-material damage as a result of an infringement of the GDPR shall have the right to receive compensation from the controller or processor for the damage suffered.
The Court of Appeals ruled that the requirements of Article 82 (1) GDPR were not fulfilled.
Second, the Court of Appeals stated that the suspension of the account did not constitute damage within the meaning of Article 82 GDPR, as not even the loss of personal data as such constituted damage. In the view of the Court of Appeals, the three-day suspension constituted only a minor violation, and claims for damages under Article 82 (1) GDPR may not be asserted for merely minor violations.
The Lower Saxony Data Protection Authority (Lower Saxony DPA) has audited 50 large and medium-sized organizations over the last couple of months regarding their implementation of the requirements of the General Data Protection Regulation (GDPR), and is currently finalising the audits. On 7 August 2019, the Lower Saxony DPA released the checklist that it used in assessing the organisations’ GDPR readiness (Checklist; available in German here).
In total, the Checklist consists of 10 categories of questions and about 200 GDPR compliance criteria. These include, for example:
| Category of questions | Main GDPR compliance criteria include |
| --- | --- |
| GDPR readiness | How did your organisation prepare for GDPR? · Which departments of your organisation have been involved in GDPR preparation? · Did your organisation train employees on GDPR? |
| Records of processing activities (ROPAs) | How did your organisation ensure that it created ROPAs for all necessary processing activities? · How does your organisation ensure that it updates its ROPAs? |
| Legal bases for data processing | What are the legal bases for your organisation’s processing activities? · Does your organisation document consents obtained? |
| Data subject rights | What processes does your organisation have in place to ensure that data subjects can assert their rights under GDPR? · Please explain, in particular, how your organisation complies with its information obligations. |
| Data security | How does your organisation ensure that it has implemented the technical and organisational measures (TOMs) necessary to ensure a level of security appropriate to the risk? · How does your organisation ensure that the TOMs are state of the art? · How does your organisation ensure that it has a documented authorisation concept for current and future IT applications? · How does your organisation ensure that privacy by design and privacy by default are implemented when goods or services are created or changed? |
| Data protection impact assessment (DPIA) | How does your organisation ensure that it recognises when a processing activity requires a DPIA? · For what processing activities did your organisation determine that a DPIA is necessary? |
| Data processing agreements | Did your organisation update existing agreements with data processors? · Does your organisation’s template data processing agreement meet all GDPR requirements? |
| Data protection officer (DPO) | How is the DPO integrated within your organisation? · Has your organisation documented that the DPO has sufficient data protection knowledge? · Was the DPO notified to the supervisory authority? |
| Data breach notifications | What is your organisation’s process for ensuring notification of data breaches within the statutory deadline? |
| Accountability | How does your organisation demonstrate compliance with the requirements listed above? |
According to the Lower Saxony DPA (see its statement from last year here), the main objective of the audits was not to issue fines, but to identify where organisations still have compliance gaps and to raise awareness of GDPR requirements. These audits and the publication of the Checklist show that, one year after the GDPR became applicable, supervisory authorities are becoming more active (e.g., by conducting general audits of organisations’ GDPR readiness), and organisations should by now be fully prepared.
The Checklist is a helpful tool for organisations to review their own GDPR readiness as it highlights the main topics that supervisory authorities might focus on.
Earlier this year, the Information Commissioner’s Office (ICO) issued a consultation on a draft code of practice on age-appropriate design for online services likely to be accessed by children (Code). The consultation closed on 31 May 2019, but the ICO has recently released an update on its progress in producing the Code.
The finalised Code will be informed by more than 450 written responses and 40 meetings with key stakeholders during the consultation period.
In particular, providers in the tech, e-gaming and interactive entertainment industries should be on alert: the ICO’s update highlights that these industries may face greater challenges with the introduction of the Code. The ICO is preparing a significant package of support for organisations implementing the Code, with specific help for designers and engineers.
The update also confirms that the Code will not segment the internet into age-related zones. Instead, the ICO “want[s] providers to set their privacy settings to ‘high’ as a default, and to have strategies in place for how children’s data is handled”. In particular, the Code will avoid creating barriers to children’s access to news content, as the ICO believes that news plays a fundamental role in the lives of children.
The final version of the Code will be delivered to the Secretary of State for Digital, Culture, Media & Sport by 23 November 2019. This will be followed by a transition period of up to a year to allow organisations to bring their online products and services into compliance.
This update is useful in setting out the ICO’s initial thoughts on key issues that online providers should consider regarding children accessing their services. Although the update is quite general, interested parties can review the ‘standards’ set out in the consultation for a better steer on the likely substance of the final Code. Firms in the tech, e-gaming and interactive entertainment industries should keep an eye out for the release of the Code by 23 November 2019. We look forward to receiving more substantive information from the ICO, in the form of either further updates or the finalised Code itself. In the meantime, keep an eye on our blog for any developments in this area.
Recently, the Berlin Data Protection Authority (Berlin DPA) announced that it would issue a high administrative fine for violations of the General Data Protection Regulation 2016/679 (GDPR). The announcement is available in German on the website of the City of Berlin. The fine will likely be in the double-digit millions of euros. The Berlin DPA further commented that it recently imposed two fines on one organisation in the aggregate amount of €200,000, but did not disclose any further details of the underlying GDPR violations.
The announcement of the Berlin DPA is a clear shift from the previous practice of German Data Protection Authorities of issuing much smaller fines. According to a report in the German newspaper Welt Am Sonntag published on 12 May 2019 (available here), German DPAs imposed 81 fines in the first year post-GDPR. These fines ranged from a few hundred euros to five-digit amounts, and totalled in aggregate €485,490.
The announcement of the Berlin DPA comes on the heels of the UK Information Commissioner’s Office’s announcements of its intention to issue separate fines of €110 million and €205 million for data security violations (Article 32 GDPR), and of the Italian Data Protection Authority’s €2 million fine for telemarketing without consent.
Organisations should continue to close any GDPR compliance gaps and, in particular, be prepared to maintain sufficient documentation to comply with their accountability obligations under Article 5(2) GDPR.
Rack Room Shoes, Inc. (Rack Room) has agreed to pay up to nearly $26 million to settle a class action lawsuit alleging violations of the Telephone Consumer Protection Act (TCPA). The lawsuit, Goldschmidt v. Rack Room Shoes, Inc., centers on claims that defendant Rack Room violated the TCPA when it initiated a text message campaign using an automatic telephone dialing system to target consumers without their express written consent.
According to the complaint, Rack Room owns and operates over 400 retail footwear stores across 24 states. Plaintiff is a Florida resident who received various text messages promoting Rack Room’s business and goods. He alleged that the “impersonal and generic nature” of the text messages and the use of a short code from which the text messages originated established that Rack Room utilized an automated text messaging platform to transmit those messages.
Plaintiff argued that the transmission of these text messages violated the TCPA, which prohibits telemarketing calls and texts to a wireless number using an automatic telephone dialing system without the recipient’s prior express written consent. Plaintiff claimed he never provided such consent to Rack Room and that he and other members of the putative class were each entitled to a minimum of $500 for each violation under the TCPA.
While denying liability, Rack Room agreed to the following to settle this matter:
- to make available a settlement fund up to $25.97 million;
- to provide a $10 rewards voucher to each class claimant; and
- to institute policies and procedures to ensure it complies with the TCPA.
Under the proposed settlement order, the certified class consists of those who received a text message from Rack Room – specifically those who enrolled in the Rack Room Reward Program or the Off Broadway Reward Program by providing their cell phone number and received a text message on or after April 2, 2014. According to the proposed settlement order, approximately 5.2 million individuals are members of the settlement class.
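The scale of the statutory exposure helps explain why settling was attractive. As a rough, purely illustrative sketch (assuming a single violation per class member at the $500 statutory minimum; actual per-member message counts were not disclosed):

```python
# Illustrative only: back-of-the-envelope estimate of TCPA exposure.
# Assumes one violation per class member at the $500 statutory minimum;
# actual violation counts were not disclosed in the settlement papers.

CLASS_SIZE = 5_200_000          # approximate settlement class size
STATUTORY_MINIMUM = 500         # minimum statutory damages per violation (USD)
SETTLEMENT_FUND = 25_970_000    # agreed settlement fund cap (USD)

potential_exposure = CLASS_SIZE * STATUTORY_MINIMUM
print(f"Potential statutory exposure: ${potential_exposure:,}")
# Potential statutory exposure: $2,600,000,000

print(f"Settlement fund as share of exposure: {SETTLEMENT_FUND / potential_exposure:.1%}")
# Settlement fund as share of exposure: 1.0%
```

On these assumptions, the theoretical floor of class-wide statutory damages would exceed $2.6 billion, roughly a hundred times the settlement fund.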
This proposed settlement highlights the costly risks of sending text messages to consumers without appropriate consent. Companies should carefully evaluate their marketing strategies and practices around the use of consumers’ cell phone numbers, and should ensure that appropriate express written consent is obtained before sending promotional text messages to consumers. Companies should also maintain evidence of that consent in order to rebut any claims that proper consent was not obtained.
The UK’s new prime minister, Boris Johnson, has vowed that the UK will leave the EU on October 31, 2019. A no-deal (or “hard”) Brexit poses many privacy and data protection challenges for companies that operate in the UK. Post-Brexit privacy and data protection issues that you need to consider include:
- how to maintain uninterrupted personal data flows between the EU and the UK;
- the UK’s status as a “third country” in the event of a no-deal Brexit, which will impede the transfer of personal data from the EU to the UK;
- whether companies selling into the UK need to appoint a local representative after Brexit;
- the impact Brexit will have on companies’ existing lead supervisory authority structures; and
- the future of eMarketing and ePrivacy laws in the UK.
To read more about these issues and what actions you should take before and following Brexit, click here.
On July 25, 2019, New York Governor Andrew Cuomo signed into law the Stop Hacks and Improve Electronic Data Security (SHIELD) Act (S.5575B/A.5635), which significantly increases the obligations of businesses handling private information to notify affected consumers upon experiencing a security breach. Additionally, Governor Cuomo signed the Identity Theft Prevention and Mitigating Services Act (A.2374/S.3582), requiring consumer credit reporting agencies to offer identity theft prevention and mitigation services to consumers who have been affected by a security breach of the agency’s system.
In an official press release announcing his signature on both pieces of legislation, the Governor emphasized the significance of implementing such laws to protect New Yorkers against security breaches. Citing a recent significant data breach, Cuomo noted that “[a]s technology seeps into practically every aspect of our daily lives, it is increasingly critical that we do everything we can to ensure the information that companies are trusted with is secure . . . [t]he stark reality is security breaches are becoming more frequent and with this legislation New York is taking steps to increase protections for consumers and holding these companies accountable when they mishandle sensitive data.”
The Federal Trade Commission’s (FTC) recent $5 billion settlement with Facebook is unprecedented in multiple respects:
- The $5 billion penalty represents the largest privacy and data security settlement in history – it is almost 20 times larger than the recent Equifax Inc. settlement and dwarfs recent EU data protection enforcement actions.
- As part of the settlement, new corporate governance measures relating to privacy and data security will be required, including an independent committee of the board of directors, with specific nomination requirements and subject matter coverage. This will place pressure on many boards and organizations to freshly examine information governance risk.
- The settlement also requires executive certifications, which, if modeled by other companies, will trigger dramatic changes in accountability as executives rely on experts, internal compliance teams, audit and related expertise for assurance and attestation in order to avoid civil and criminal penalties and derivative litigation.
The signaling effect of the settlement to the broader business community intended by the primary privacy regulator in the United States cannot be overstated. Similar enforcement actions, such as individual prosecutions in Europe under the EU Data Protection Directive, triggered immediate response and attention from corporations just as the emergence of breach notification laws resulted in massive new investments in information security programs in the United States.
The recently announced multistate settlement between credit reporting company Equifax Inc. and the Attorneys General of 48 states, Puerto Rico, and the District of Columbia (the AGs) demonstrates the increasingly active role of state regulators in policing the privacy and security practices of businesses that handle consumers’ personal information. The multistate settlement is part of a comprehensive agreement between Equifax, the AGs, and other state and federal regulators, under which Equifax will pay at least $575 million and up to $700 million to resolve investigations and litigation arising out of a 2017 data breach alleged to have affected over 147 million consumers.
The U.S. Chamber of Commerce (the “Chamber”) recently hosted a data privacy summit, “#DataDoneRight”, which brought together a group of industry professionals, government stakeholders, and privacy thought leaders to talk about data privacy.
The Chamber, which has proposed federal privacy legislation, engaged a wide variety of speakers, covering multiple viewpoints, to demonstrate the need for a comprehensive and fair federal privacy law:
- Alastair Mactaggart, the primary architect of the California Consumer Privacy Act (CCPA), explained his efforts to pass that law, including his negotiations with California legislators. Mactaggart expects the CCPA to be a floor for any federal privacy law, both because he claims consumers demand it and because the significant number of federal legislators from California (who alone comprise 20 percent of the House Democratic caucus) would not approve of any law that undercut the consumer rights granted by the CCPA. Mactaggart likened privacy to auto safety, emissions, and tobacco use, all areas where public sentiment required regulatory change and consequent industry changes.
- FTC Commissioner Noah Phillips – speaking for himself, not officially for the FTC – strongly supports a comprehensive federal privacy law, with clear rules of the road from Congress. Although he believes the FTC is the appropriate authority for enforcement due to its years of privacy expertise, Commissioner Phillips hopes for only limited need for FTC rulemaking. The commissioner also expressed his support for a privacy law focused on avoiding actual consumer harms. He spoke against a private right of action and for gradual, calibrated penalties that would not stifle innovation, and perhaps even a system where violators are offered the opportunity to cure deficiencies before regulatory action occurs.
- Congresswoman Cathy McMorris Rodgers (R-WA) advocated for a similar structure. She supports federal legislation, especially to avoid the negative effects that a patchwork of separate state privacy laws would have on small business and innovation, but believes Europe’s General Data Protection Regulation was the “wrong approach.” The congresswoman does not think a private right of action would benefit consumers – just attorneys – and highlighted a number of benefits that businesses’ use of data provides consumers, including time savings, more efficient and improved customer service, and targeted loyalty programs. She believes that data privacy is an issue where Congress can bridge partisan divides, but recognized that might be difficult with the current domination of the media by presidential politics.
- Georgia Attorney General Chris Carr concurs with Commissioner Phillips and Congresswoman McMorris Rodgers. AG Carr emphasized his state’s pro-business practices, noting that Georgia was voted the best state to do business in for six years in a row. That said, he also made it clear that being pro-business does not mean being anti-consumer. A state can support business while still having effective consumer protection. AG Carr suggested that, in the absence of necessary federal action, state governments would fill the void, creating a patchwork of state data privacy laws and leading to confusion in the business community.
As “#DataDoneRight” made clear, privacy remains a priority for legislators, regulators, industry, and consumer advocates. Although the diverse group assembled by the Chamber was unable to agree on when we might see a federal privacy law, all agreed that one is necessary. Until such a law is passed, industry will need to be prepared for the current patchwork, because states will continue to fill the gaps created by congressional inaction.