On 12 September 2019, the Committee of Ministers of the Council of Europe announced that an Ad hoc Committee on Artificial Intelligence (CAHAI) would be set up to consider the feasibility of a legal framework for the development, design and application of artificial intelligence (AI). On the same day, the United Kingdom’s data protection supervisory authority, the Information Commissioner’s Office (ICO), released the latest in its series of blogs on developing its framework for auditing AI. The blog (here), published on 12 September 2019, focuses on privacy attacks on AI models. With interest in the development of an AI legal framework increasing, what does the ICO consider to be the data security risks associated with AI?
On 9 September 2019, the German Federal Ministry of Economic Cooperation and Development (Bundesministerium für wirtschaftliche Zusammenarbeit und Entwicklung – BMZ) introduced a new, state-regulated environmental label for “Green Button” (Grüner Knopf) certified textiles with a press release, available here. The BMZ also launched the official Green Button website, which is available in German at http://www.gruener-knopf.de/.
In a nutshell
The Green Button is a logo that serves as evidence that the textile products concerned were manufactured and placed on the market in a socially and environmentally sustainable manner. The state is responsible for determining the requirements for Green Button certification.
The Green Button is intended to help consumers and public procurement agencies in identifying such textile products. The logo can be attached directly to certified textile products to demonstrate that the products meet the demanding social and environmental requirements.
Late last week, the California legislature approved five bills intended to clarify the scope and required compliance obligations of the California Consumer Privacy Act (CCPA or the Act). Organizations now have just over three months to determine whether they need to comply with the newly amended CCPA, assess what their obligations are, and implement the policies, procedures, and operational changes necessary to comply with the law.
- The amendments clarify that, at least for 2020, this consumer privacy law will apply to the personal information of employees, job applicants, and contractors, as well as to personal information collected through certain business-to-business interactions, but only in certain respects.
- The amendments add flexibility to the processes that businesses may use for receiving and verifying consumer access and deletion requests.
- The amendments exclude from the CCPA’s applicability certain processing of consumer report data that is already governed by the federal Fair Credit Reporting Act.
- The amendments clarify how encryption and redaction may play into the private right of action for data breaches.
- The amendments confirm that properly deidentified or aggregate data is not personal information under the Act.
In July 2019, the UK privacy regulator, the Information Commissioner’s Office (ICO) issued a warning about the privacy implications of automated facial recognition technology (AFR). The ICO was concerned that AFR “represent[s] the widespread processing of biometric data of thousands of people as they go about their daily lives.”
The UK High Court recently handed down an important and timely decision in Bridges, R (on the application of) v. Chief Constable of South Wales Police [2019] EWHC 2341 (Admin). The Court ruled that the South Wales Police’s (SWP) use of AFR was proportionate, lawful, and consistent with human rights and data protection laws. This was despite SWP’s use of AFR interfering with the privacy and data protection rights of affected individuals.
This is the world’s first legal challenge over the use of AFR.
SWP has trialled a particular type of facial recognition technology (AFR Locate) since April 2017 with a view to it being rolled out nationally. AFR Locate works by capturing digital images from CCTV feeds of the faces of members of the public. The images are then processed in real time to extract facial biometric information. This information is compared with facial biometric information of people on police watch lists. If there is no match between the images, the data is immediately deleted after being processed.
The legal challenge was brought by the human rights campaign group Liberty on behalf of a Cardiff resident, Ed Bridges. Mr. Bridges argued that SWP’s use of AFR contravened his human rights as well as data protection laws.
The question for the Court was whether the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR.
The High Court dismissed Mr. Bridges’ claim on all grounds. Leaving aside claims relating to the breaches of human rights laws, the key data protection findings were as follows.
- Justifiable processing: Biometric data captured by AFR is the personal data of people who are not on a police watch list. The Court considered that members of the public whose images are captured by AFR are sufficiently individuated from all others. Although this processing of the biometric data was also “sensitive processing,” the Court ruled that it was justified. For SWP to achieve its purpose of identifying persons on watch lists, biometric information must be processed in the first place. The processing of this data is necessary for SWP’s legitimate interests to detect and prevent crime.
- Law enforcement processing: Biometric data processing does not contravene the data protection principle that any personal data processing for law enforcement purposes must be lawful and fair.
- Data protection impact assessment (DPIA): SWP’s DPIA for AFR complied with UK law. The Court ruled that SWP’s DPIA set out a clear narrative that took account of the potential breaches of UK human rights laws. It also identified safeguards that are in place to determine what personal data will be retained and why.
- Appropriate use: The current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR. SWP’s use of AFR Locate is consistent with human rights and data protection laws.
It has always been difficult to achieve a regulatory balance between harnessing new technologies and safeguarding the privacy and data protection rights of affected individuals. This ruling is fact-specific and should not be interpreted as a UK-wide green light to use AFR. However, the ruling nonetheless provides much needed judicial clarity on AFR. The ICO, which has been critical of police and private use of facial recognition technology, has said it will review the judgment carefully. As such, we expect the ICO to issue further recommendations and guidance to law enforcement about how to deploy any future AFR technology. Keep an eye on this blog as we’ll be sure to keep you fully up-to-date.
The Summer 2019 Edition of the quarterly IT & Data Protection Newsletter by Reed Smith Germany has just been released:
In this edition we cover the following topics:
- ECJ and GDPR: Another decision hitting social media activities by companies
- EDPB does not opt for changes to EU standard contractual clauses
- EU Commission on implementation of GDPR
- Second German GDPR Implementation Act
- Frankfurt Court of Appeals: no general prohibition on bundling marketing consent to sweepstakes
- ECJ: landmark decision on sampling
- Frankfurt Court of Appeals ruling on influencer marketing and manufacturer tags
- Munich Regional Court: Affiliate links to be clearly labelled as advertisement
The newsletter also includes several recommended reads from among the publications of the European Data Protection Board and the German data protection authorities.
We hope you enjoy reading it.
In its response dated 3 July 2019 (Response; file no. 19/11351, available in German here) to an inquiry by members of the German parliament (Inquiry), the German government set out its position on the current draft Regulation on Privacy and Electronic Communications (ePrivacy Regulation), and particularly on “tracking”. The German government summarises its assessment of the ePrivacy Regulation:
“Germany has declared its view at a session of the Council of the EU on 7 June 2019 in Luxembourg. The ePrivacy Regulation must guarantee a high level of protection that goes beyond the protection that the GDPR provides. The current draft does not achieve this objective. Germany cannot support the current draft.”
German government’s assessment of the ePrivacy Regulation
The Inquiry sought, among other things, the German government’s responses on (i) whether “tracking” should be regulated more extensively at an EU level and (ii) what specific amendments have to be made to the ePrivacy Regulation.
Robocalls: everyone receives one or two, but more likely dozens. While some are helpful, most are annoying, and the worst can result in financial fraud. While the FCC and Congress have been taking steps toward addressing the issue, state attorneys general (AGs) have taken the first major action to end unwanted robocalls. On August 22, AGs from all 50 states plus the District of Columbia, together with 12 large telecom companies, announced a new framework to address the billions of unwanted robocalls that consumers receive every year.
The announced memorandum of understanding is a voluntary set of principles, under which the telecom companies agree to:
- Offer free call blocking and labeling.
- Implement STIR/SHAKEN, a call authentication system to combat illegal caller ID spoofing.
- Analyze and monitor high-volume voice network traffic.
- Investigate suspicious calls and calling patterns and take appropriate action, including notifying law enforcement.
- Confirm the identity of new commercial VoIP customers.
- Require traceback cooperation in contracts.
- Cooperate in traceback investigations.
- Continue cooperating with state AGs in devising further means of combatting illegal robocalls.
Although the voluntary principles do not include specific enforcement mechanisms, AGs may be able to enforce the principles under existing state laws. AGs already have the ability to initiate actions under existing consumer protection laws for deceptive conduct; if any of the participating telecom companies do not abide by the principles, they could face liability under those laws.
While participating telecom providers are likely already implementing plans to comply with these principles, the ripples from this announcement may affect other businesses as well. Due to the increased verification requirements, non-telecom companies can expect to provide their voice call providers with additional information during their next contract cycle or execute addendums ensuring compliance with these principles. And companies in all industries should be aware that state AGs have powerful tools in their arsenals to protect consumers.
Many states are following in the footsteps of Illinois’ Biometric Information Privacy Act (BIPA), a law that has led to an increase in the volume of class action privacy litigation and highlighted the importance of enterprise-level management of biometric data (e.g., fingerprint, voiceprint, and retina, facial, or iris image). Organizations that collect and use biometric data for employee tracking or consumer-facing uses (including the collection and use of characteristics like heart rate or step counts) should be aware of growing trends in biometric privacy laws and the associated risk of follow-on class actions. They should be proactive by evaluating their compliance with existing and soon-to-be-effective laws and by anticipating new laws on the horizon in other states.
Earlier this year, in Rosenbach v. Six Flags Entertainment Corporation, the Illinois Supreme Court ruled that a plaintiff need only plead a violation of BIPA to be considered “aggrieved,” as the statute expressly requires, and so maintain a private right of action. See 740 Ill. Comp. Stat. Ann. 14/20. Since then, numerous lawsuits have been filed under BIPA, including by employee-plaintiffs challenging employer practices such as the collection and use of fingerprints to track time worked or for security controls. Multiple states have either passed or introduced BIPA-like bills that are indicative of increased risk for organizations that collect, use, and store the biometric information of either their employees or customers.
Numerous states have followed in BIPA’s footsteps by either expanding the definition of “personal information” under state data breach notification laws to include biometric information or creating new rights (including private rights of action) for biometric data subjects. Two high-profile examples of recent privacy laws that incorporate biometrics are New York’s SHIELD Act and the California Consumer Privacy Act (CCPA) (which is not yet effective and faces continued amendments). If amendments to the CCPA expand the currently contemplated private right of action to encompass any violation of the law, the litigation landscape could be similar to Illinois’ under BIPA. As indicated below, there is a concerted effort by a growing number of states to augment organizations’ obligations to be transparent and restrict their ability to collect, store, and transact with biometric data without sufficient notice and consent. Below is a high-level overview of the current landscape of state laws that touch on biometric data and their status.
In effect

- Illinois: 740 Ill. Comp. Stat. 14/1 et seq. (BIPA): Includes a private right of action and a low threshold for allegation of injury; noncompliance can result in damages of $1,000 to $5,000 for each improper collection of biometric data; applies to employee data.
- Texas: Tex. Bus. & Com. Code Ann. section 503.001(b) (2009), the Capture or Use of Biometric Identifier Act (CUBI): Requires organizations to provide notice and obtain consent before collecting biometric data and precludes the sale, lease, and disclosure of biometric data for commercial purposes without appropriate consent or disclosure. No private right of action (only the Texas attorney general can recover civil penalties).
- Washington: Wash. Rev. Code 19.375.010: Prohibits the collection and use of biometric identifiers for a commercial purpose without notice or consent, but excludes collection and use for “security purposes.” No private right of action (only the Washington attorney general can enforce).

Passed but not yet effective

- Arkansas: Amends the previous Personal Information Protection Act to expand the definition of “personal information” to include biometric data and applies data breach notification requirements to biometric data. No private right of action.
- California: Cal. Civ. Code section 1798.100 (effective January 1, 2020) (CCPA): Defines “personal information” to include biometric information, and requires covered businesses (any for-profit entity that collects a consumer’s personal information and does business in California with an annual gross revenue in excess of $25 million; buys, receives, shares, or sells the personal information of more than 50,000 consumers; or derives 50 percent or more of its annual revenue from selling consumers’ personal information) to disclose to consumers information about the collection of their personal data. As noted above, the CCPA continues to face a shifting landscape of amendments prior to its effective date.

Pending or on the horizon

The following cities and states have proposed legislation that seeks to expand compliance obligations with respect to biometric information collected from consumers or employees in a manner similar to BIPA, but these laws remain pending (e.g., in committee or debate):
Comment and practical implications
In order to mitigate the risk of litigation and class actions springing from the proliferation of these laws, particularly in the states where private rights of action are contemplated, organizations should be proactive and apply lessons learned from BIPA compliance to their continued preparedness for current and upcoming biometric laws, especially the CCPA. The time and resources allocated to a refresh of policies and procedures could be far less costly than reacting after the fact to litigation spawned from one of the many biometric privacy laws on the horizon. Most importantly, organizations that have or anticipate having employees or consumers in the states listed above should consider:

1. Their written policies and procedures regarding the collection, use, storage, retention, and deletion of personal information, and biometric information in particular, including how those practices are reflected in the enterprise-wide retention schedule and how that data is secured within company systems and applications.
2. What, if any, notice and consent framework the business has in place for obtaining biometric data from employees or customers.
3. The current or anticipated use cases for collected biometric information, including limitations and restrictions on third-party disclosure or sale, and what, if any, third-party vendors interact with biometric data collected by the business and for what purpose.
4. Whether vendor diligence and contracting sufficiently address compliance with all applicable laws and appropriately allocate responsibility for contingencies, such as a data breach or limitations on the sale of biometric data, to the extent that third parties do interact with biometric data collected by the company.
In its recent decision of 11 June 2019 (docket no.: 4 U 760/19, available here), the Dresden Court of Appeals (Oberlandesgericht Dresden – Court of Appeals) had to decide on claims for damages under Article 82 GDPR with regard to minor violations of the GDPR.
The defendant, the provider of a social network, had deleted a post from the plaintiff and suspended the plaintiff’s user account for three days. The plaintiff asserted, inter alia, material and non-material claims for damages under Article 82 GDPR.
The Court of Appeals’ decision
The Court of Appeals dismissed the asserted claims under Article 82 GDPR.
Article 82 (1) GDPR provides that:

“Any person who has suffered material or non-material damage as a result of an infringement of this Regulation shall have the right to receive compensation from the controller or processor for the damage suffered.”
The Court of Appeals ruled that the requirements of Article 82 (1) GDPR were not fulfilled.
The Court of Appeals further stated that the suspension of the account did not constitute damage within the meaning of Article 82 GDPR, as not even the loss of personal data as such constitutes damage. In the view of the Court of Appeals, the three-day suspension constituted only a minor violation, and claims for damages under Article 82 (1) GDPR may not be asserted for merely minor violations.
The Lower Saxony Data Protection Authority (Lower Saxony DPA) has audited 50 large and medium-sized organizations over the last couple of months regarding their implementation of the requirements of the General Data Protection Regulation (GDPR), and is currently finalising the audits. On 7 August 2019, the Lower Saxony DPA released the checklist that it used in assessing the organisations’ GDPR readiness (Checklist; available in German here).
In total, the Checklist consists of 10 categories of questions and about 200 GDPR compliance criteria. These include, for example:
GDPR readiness
- How did your organisation prepare for GDPR?
- Which departments of your organisation have been involved in GDPR preparation?
- Did your organisation train employees on GDPR?

Records of processing activities (ROPAs)
- How did your organisation ensure that it created ROPAs for all necessary processing activities?
- How does your organisation ensure that it updates its ROPAs?

Legal bases for data processing
- What are the legal bases for your organisation’s processing activities?
- Does your organisation document consents obtained?

Data subject rights
- What processes does your organisation have in place to ensure that data subjects can assert their rights under GDPR?
- Please explain, in particular, how your organisation complies with its information obligations.

Data security
- How does your organisation ensure that it has implemented the technical and organisational measures (TOMs) necessary to ensure a level of security appropriate to the risk?
- How does your organisation ensure that the TOMs are state of the art?
- How does your organisation ensure that it has a documented authorisation concept for current and future IT applications?
- How does your organisation ensure that the concepts of privacy by design and privacy by default are implemented in the process of creating or changing goods or services?

Data protection impact assessment (DPIA)
- How does your organisation ensure that it recognises that a processing activity requires a DPIA?
- For what processing activities did your organisation determine that a DPIA is necessary?

Data processing agreements
- Did your organisation update existing agreements with data processors?
- Does your organisation’s template data processing agreement meet all GDPR requirements?

Data protection officer (DPO)
- How is the DPO integrated within your organisation?
- Has your organisation documented that the DPO has sufficient data protection knowledge?
- Was the DPO notified to the supervisory authority?

Data breach notifications
- What is your organisation’s process for ensuring notification of data breaches within the statutory deadline?

Accountability
- How does your organisation demonstrate compliance with the requirements listed above?
According to the Lower Saxony DPA (see statement from last year here), the main objective of its audits was not issuing fines, but determining where organisations still have compliance gaps and raising awareness of GDPR requirements. These audits and the publication of the Checklist show that, one year after the GDPR entered into force, supervisory authorities are becoming more active (e.g., by conducting general audits of organisations’ GDPR readiness), and organisations should by now be fully prepared.
The Checklist is a helpful tool for organisations to review their own GDPR readiness as it highlights the main topics that supervisory authorities might focus on.