Beginning in May 2022, employers in New York state will be required to make certain disclosures to their workers if they engage in electronic monitoring of employee communications. On November 8, Governor Kathy Hochul signed into law a bill requiring all employers to provide written notice to newly hired employees if they intend to monitor or otherwise intercept employee emails, text messages, telephone conversations, Internet access, or usage of an electronic device or system. Read more about New York’s new notice requirement and civil penalty regime on our Employment Law Watch blog.
In one of the most highly anticipated judgments in recent years, the UK Supreme Court has unanimously rejected a class action-style compensation claim under the Data Protection Act 1998. The decision arose from a claim brought against Google LLC (Google) by Richard Lloyd on behalf of four million data subjects.
The Federal Trade Commission (FTC or Commission) has issued a final rule clarifying its data security requirements for certain covered financial institutions. The new rule, which amends the Safeguards Rule originally promulgated in 2002 under the Gramm-Leach-Bliley Act (GLBA), outlines specific criteria to be incorporated as part of GLBA-covered financial institutions’ information security programs. The primary changes include:
- A requirement to designate a single qualified individual responsible for overseeing the information security program and periodically reporting to the board (or other governing body)
- Identification of specific security risk assessment criteria and a requirement that such assessments be documented in writing
- Specific required safeguards, including access controls, encryption, data disposal procedures, continuous monitoring, and penetration testing
- Service provider selection criteria and a related requirement to periodically assess service providers based on perceived risk
- Expansion of the definition of “financial institution” to clarify that it includes entities providing “finder” services incidental to financial activities
The updated rule takes effect 30 days after publication in the Federal Register, but some of the more significant new requirements will not take effect for another year.
On 7 September 2021, the High Court granted a defendant’s application for summary judgment in a claim for compensation brought by three data subjects following a data breach suffered by the defendant, on the basis that the breach was ‘trivial’.
The case related to a single email (with attachments) sent by the defendant, a firm of solicitors. The defendant, acting for a school to which the claimants, a set of parents, owed outstanding school fees, had been instructed to write to the claimants with a demand for payment. The email consisted of a letter and a copy of the statement of account.
Due to a one-letter difference in one of the email addresses, the correspondence was sent to an unintended recipient. The unintended recipient responded promptly, indicating that they thought the email was not intended for them. The defendant then responded promptly, asking the unintended recipient to delete the email, which they agreed to do. The recipient was unknown to the claimants personally.
The email contained the claimants’ names, address and the amount of school fees owed, as well as reference to proposed legal action, but it did not contain any financial information in the form of bank or card details, or information about the income or financial position of the claimants.
The claim brought by the claimants was for, amongst other things, compensation for non-material damage (i.e., distress) under article 82 of the General Data Protection Regulation ((EU) 2016/679) (GDPR) and section 169 of the Data Protection Act 2018. This was based on (i) the claimants having suffered “lost sleep”, (ii) the breach having “made them feel ill” and (iii) extensive time having been spent by the claimants dealing with the issue.
On 18 October 2021, the European Commission (the Commission) launched a public consultation on adapting the civil liability rules for the digital age, with a specific focus on challenges arising from the adoption of artificial intelligence (AI).
Why the civil liability rules need to change
While the Product Liability Directive 85/374/EEC (Directive) sets out rules aimed at ensuring that injured parties are compensated for damage caused by defective products, the Commission has previously noted, in a 2018 report and in its inception impact assessment (IIA), that the Directive is no longer fit for the digital age. Challenges include:
- Whether and how intangible digital elements such as software can be classified as products
- The lack of clarity on who should be liable for defects after products are put into circulation
- Significant obstacles for injured parties to obtain compensation, especially given the difficulties in establishing causal links where the behaviours of AI systems are partially or wholly opaque
On 13 October 2021, the European Data Protection Board (EDPB) adopted the final version of its Guidelines 10/2020 on restrictions of data subject rights under article 23 of the General Data Protection Regulation ((EU) 2016/679) (GDPR) (the Guidelines) during its forty-third plenary session. The adoption follows a public consultation on the EDPB’s draft guidelines, which concluded in February 2021. The Guidelines aim to provide clarity on the application of article 23 of the GDPR.
Article 23 GDPR
The rights of data subjects under the GDPR are set out in articles 5, 12 to 22 and 34. Article 23 lists the conditions under which EU member states can restrict these rights, by legislative measures, to protect the rights and freedoms of others; for example, in relation to safeguarding national and public security, enforcement of civil law claims, and protection of judicial independence, among others.
In a press release published on 19 October 2021 following the adoption of the Guidelines, the EDPB specified that the Guidelines:
- aim to recall the conditions surrounding the use of such restrictions by EU member states or the EU legislator in light of the EU Charter of Fundamental Rights and the GDPR;
- provide a thorough analysis of the criteria to apply restrictions, the assessments that need to be observed, how data subjects can exercise their rights after the restrictions are lifted, and the consequences of infringements of article 23 GDPR; and
- analyse how the legislative measures setting out the restrictions need to meet the foreseeability requirement and examine the grounds for the restrictions listed in article 23 GDPR, and the obligations and rights which may be restricted.
Notably, there is no definition of ‘restrictions’ in the GDPR. The Guidelines, however, define the term ‘restrictions’ as any limitation of scope of the obligations and rights provided for in articles 12 to 22 and 34 GDPR, as well as the corresponding provisions of article 5.
The Guidelines state that the restrictions to rights concern the right to transparent information, right to information, right of access, right to rectification, right to erasure, right to restriction of processing, notification obligation regarding rectification or erasure of personal data or restriction of processing, right to data portability, right to object, and right not to be subject to automated individual decision making. Any other data subject rights, such as the right to lodge a complaint to the supervisory authority, or other controllers’ obligations cannot be restricted. Any restrictions should be seen as exceptions to the general rule allowing the exercise of rights and imposing the obligations enshrined in the GDPR. Restrictions should be interpreted narrowly, and only be applied in specific circumstances and only when certain conditions are met.
Further, the Guidelines note that restrictions must pass a necessity and proportionality test in order to be compliant with the GDPR, and that this test should be carried out before the legislator decides to provide for a restriction. As such, restrictions that are extensive and intrusive, to the extent that they void fundamental rights, cannot be justified.
On October 5, 2021, California Governor Gavin Newsom signed into law amendments to the California Consumer Privacy Act (CCPA) via Assembly Bill 694. Businesses are eagerly awaiting clarification on many aspects of the CCPA and the California Privacy Rights Act (CPRA), which is set to go into effect on January 1, 2023, with a 12-month look-back period.
However, Assembly Bill 694 does not provide the clarification everyone is awaiting. The bill’s effect is limited to a number of technical and grammatical corrections to the CCPA, which likely will not have a material effect on current compliance measures.
The most significant change is the amendment to Section 1798.199.40, which governs the operations of the California Privacy Protection Agency (CPPA). Assembly Bill 694 clarifies the timeline for when the CPPA must begin its rulemaking process. The previous version of the statute contained two contradictory statements: one said that the CPPA had until the later of July 1, 2021, or six months after notifying the Attorney General that it was prepared to assume rulemaking responsibilities, to begin the process; the other said it was the earlier of the two dates. Assembly Bill 694 clarifies that the agency has until the later of the two dates. Since July 1, 2021 has already passed, this amendment indicates that the new rules will not be issued for at least another six months. The timing of the CPPA’s rulemaking authority matters because the agency is expected to be very active in implementing new privacy rules.
* * *
California remains particularly active in regulating privacy, as it has been for a couple of decades, even as the federal government has been unable to enact national legislation. Other recent developments in California include the enactment of the Genetic Information Privacy Act earlier this month.
On October 6, 2021, the Department of Justice (DOJ) announced the launch of its new Civil Cyber-Fraud Initiative that emphasizes accountability for conduct that could increase cybersecurity threats to the government. This initiative supports the Biden administration’s goals and efforts to improve U.S. cybersecurity generally. Those who do business with the government or receive federal funds need to be mindful of the updated compliance expectations this initiative poses. Our government contracts and national security teams discuss these risks in detail on our Global Regulatory Enforcement Law Blog.
The European Court of Justice (ECJ) ruled on 6 October 2021 in Top System SA v. Belgian State (Case C‑13/20) EU:C:2021:811 that, under article 5(1) of the Software Directive (Council Directive 91/250/EEC) (the Directive), lawful purchasers of software are permitted to decompile programs (in whole or in part) in order to correct errors affecting the software’s operation.
The decision comes as the result of a request for a preliminary ruling by the Brussels Court of Appeal. The request had been made in proceedings between Top System SA and the Belgian state concerning the decompilation by the Selection Office of the Federal Authorities in Belgium (SELOR) of a computer program developed by Top System and forming part of an application in respect of which SELOR holds a user licence.
What the Directive says
Article 4 of the Directive deals with “Restricted Acts” that give developers exclusive rights to reproduce and alter computer programs, whereas article 5 allows the licensee to reproduce and alter a program where necessary to use it for its intended purpose, including for error correction. Article 6 deals with decompilation, permitting the reproduction and translation of software code where doing so is indispensable to obtain the information necessary to achieve interoperability, so long as: it is done by the licensee or another authorised person; the information necessary to achieve interoperability is not readily available to the licensee; and any related actions are limited to those portions of the original program necessary to achieve interoperability.
Decompilation: ECJ’s ruling
In Top System, the ECJ ruled that, on a proper interpretation of article 5, the lawful purchaser of a computer program is entitled to decompile the program (in whole or in part) in order to correct errors affecting its operation, without being required to satisfy the requirements of article 6. The licensee is not, however, permitted to use the decompiled software for any purpose other than error correction.
Advocate general’s opinion
The advocate general’s opinion on the case confirmed that a licensee could decompile a computer program to correct errors, unless restricted by the licence. The opinion emphasised the independence of articles 5 and 6 and the availability of decompilation under article 5 as well as under article 6. Specifically, it stated that article 5 should be interpreted, independently of article 6 (which permits decompilation for interoperability purposes), as permitting a licensee to decompile a computer program where necessary to correct errors affecting its functioning.
The lesson from this ECJ ruling is that a computer program can be decompiled where necessary to fix an error under article 5, and that this right is independent of the article 6 right to decompile a program where necessary for interoperability. While the case should not be seen as opening the floodgates to decompilation of software by licensees, it does offer helpful clarity as to the rights and obligations of both the licensor and the licensee when it comes to managing software errors.
To limit disputes around the decompilation of licensed software, the ECJ has advised that the procedure for correcting software errors should be addressed in the licence and contract provisions. Although the parties are not permitted to exclude the possibility of correcting errors altogether, a contractual arrangement will allow licensees and licensors to agree a method best suited to the objectives of each party.
In July 2021, the European Commission (the Commission) adopted three proposals for regulations and one proposal for a directive of the European Parliament and of the Council in relation to reforms to the EU’s anti-money laundering (AML) and counter-terrorist financing (CTF) regime. The proposals serve to implement aspects of the Commission’s May 2020 action plan in respect of the same, with a view to addressing weaknesses in these areas. The key reforms include a new EU AML and CTF authority and a new EU single AML and CTF rulebook.
On 22 September 2021, the EU’s independent data protection authority, the European Data Protection Supervisor (EDPS), Wojciech Wiewiórowski, published an opinion on the Commission’s proposals, alongside a press release.
Overall, the EDPS’ opinion of the proposals is positive, welcoming the AML package and its objective of increasing the effectiveness of AML and CTF measures. In particular, Mr Wiewiórowski praised the envisaged increased harmonisation of the AML and CTF framework at EU level, including the creation of a new European authority.