Cybersecurity Guidance on the Way for Small Businesses

On October 11, 2017, the House passed a bill that would provide guidance to small businesses on how to deal with cybersecurity issues. This legislation passed on the heels of a similar Senate bill that was approved just weeks before on September 28.

The NIST Small Business Cybersecurity Act (H.R. 2105) would require the Department of Commerce’s National Institute of Standards and Technology (NIST) to issue voluntary guidelines specifically addressing the needs of the many small businesses across the country. Much like other voluntary guidelines enacted by NIST, the guidance for small business would not seek to add additional regulation but rather would provide small business owners with a set of best practices to keep themselves safe in the constantly growing cyber world.

According to the bill, small businesses account “for 54 percent of all United States sales and 55 percent of jobs in the United States.” Not only do attacks targeting small and medium businesses account for a high percentage of cyberattacks in the U.S., according to the National Cyber Security Alliance, “[s]ixty percent of small businesses that suffer a cyberattack are out of business within 6 months.”

House sponsor Daniel Webster (R-Fla.), the owner of a multi-generational small business, stated that small businesses “are more susceptible to attacks” because they have limited cybersecurity resources and lack the “tools they need to prepare for such an event.” According to Webster, the NIST cybersecurity framework would act to “protect business owners, their employees, and their customer base, all while contributing positively to the economy.”

H.R. 2105 received bipartisan support and passed by a voice vote. The similar Senate bill, the MAIN STREET Cybersecurity Act (S. 770), also received bipartisan support.

Court Deals Blow to FTC’s Position on Unfair Data Security Practices

Over the last several years, the Federal Trade Commission (FTC) has regularly used its authority under Section 5 of the FTC Act to bring cases against companies over their allegedly unreasonable data security measures. The FTC has paid particular attention to the safeguards that manufacturers have implemented in electronic devices sold to consumers. Recently, D-Link Systems Inc., a router manufacturer, successfully challenged the FTC’s position that a Section 5 claim can be supported based solely on the existence of a data security vulnerability, without any evidence that the vulnerability was actually exploited and resulted in consumer harm.

The FTC’s Authority. Under Section 5 of the FTC Act, the FTC can investigate and obtain injunctive and equitable relief against companies that engage in unfair or deceptive acts or practices. To establish that a company’s practices are unfair, the FTC must show that the practices cause or are likely to cause substantial injury to consumers that consumers cannot reasonably avoid, and that is not outweighed by countervailing benefits to consumers or to competition.

The FTC’s Position Is that “Unreasonable” Data Security Is an “Unfair” Practice. In its complaints, the FTC commonly alleges that a company’s unreasonable data security measures are an unfair act or practice that violates Section 5. Typically, to support its position that consumers were harmed, the FTC points to evidence of both (a) a vulnerability created by the allegedly unreasonable data security practices, and (b) exploitation of that vulnerability to gain unauthorized access to data or systems. It would seem that exploitation is necessary to create a nexus between a vulnerability and any consumer harm. But, to the surprise of many, the FTC has also filed complaints against companies alleging only the existence of a vulnerability, without evidence that the vulnerability was actually exploited. In at least two cases, the FTC has alleged that the risk of cyberattack arising from a vulnerability was alone enough to satisfy the Section 5 requirement that the practice “causes or is likely to cause substantial consumer injury.”

39th International Conference of Data Protection and Privacy Commissioners publishes Resolution on Data Protection in Automated and Connected Vehicles

The 39th International Conference of Data Protection and Privacy Commissioners in Hong Kong published a Resolution on Data Protection in Automated and Connected Vehicles (the “Resolution”), which sets out fundamental data protection requirements for the mobility of the future. The Resolution proposes common international standards.

The Resolution addresses not only vehicle and equipment manufacturers, but also providers of personal transportation services, car rental providers, and providers of data-driven services (e.g., speech recognition, navigation, remote maintenance or motor insurance telematics services), as well as standardization bodies and public authorities (the “Addressees”). The Resolution expressly calls upon the Addressees to “fully respect the users’ right to the protection of their personal data and privacy and to sufficiently take this into account at every stage of the creation and development of new devices or services”.

Following the German Federal Data Protection Commissioner’s earlier proposals of June 2017 for automated and connected vehicles (available in German here), the Resolution describes how the rights of users should be protected. In particular, the Addressees are urged to comply with 16 items.

European Commission calls for enhanced responsibility of online platforms for illegal content

Detecting and removing illegal content from online platforms is an urgent challenge for the digital society today, yet so far there is no harmonised and coherent approach across the European Union. On 28 September 2017, the European Commission (the “Commission”) published a communication titled “Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms” (the “Communication”). The Commission calls for a more aligned approach, arguing that it would make the fight against illegal content more effective and would also benefit the development of the Digital Single Market. The Commission stresses that online platforms carry a significant societal responsibility and should therefore decisively step up their actions to address this problem.

Scope of the Communication

The Communication does not as such change the existing legal framework. Rather, it lays down a set of non-binding guidelines and principles for online platforms to step up the fight against illegal content online in cooperation with national authorities, Member States, and other relevant stakeholders: “It aims to facilitate and intensify the implementation of good practices for preventing, detecting, removing and disabling access to illegal content so as to ensure the effective removal of illegal content, increased transparency and the protection of fundamental rights online. It also aims to provide clarifications to platforms on their liability when they take proactive steps to detect, remove or disable access to illegal content (the so-called “Good Samaritan” actions).”

The Communication targets not only the detection and removal of illegal content; it also takes into account issues arising from the removal of legal content (“Over-Removal”), which may impact freedom of expression and media pluralism. The Commission therefore calls for adequate safeguards to properly prevent Over-Removal.

Irish High Court asks European Court to rule on legality of EU-US data transfers

On 3 October 2017, the Irish High Court held that it is for the European Court of Justice (“ECJ”) to determine whether Standard Contractual Clauses (“SCCs”) are a valid method of transferring personal data outside of the EU in compliance with privacy law. SCCs are widely used by businesses that transfer data from the EU to the US as a means of complying with European data protection laws. They are intended to give EU citizens the same level of privacy and protection when their data is stored in the US as when it is stored in the EU.

The case involves an Austrian lawyer, Max Schrems, who originally filed a complaint with the Irish Data Protection Commissioner (the “Commissioner”) challenging Facebook’s use of SCCs. Schrems brought the case following revelations in The Guardian that the US National Security Agency had direct access to data on European users of Facebook that had been transferred from the EU and stored in the US. Schrems argued that the Commissioner should order Facebook to suspend sending data to the US, claiming that the standard clauses were not adequate to protect privacy under EU legal standards because of a lack of safeguards against US government surveillance.

The Commissioner argued that the case should be referred to the ECJ to determine whether the Commission’s decision on standard clauses is consistent with the EU Charter of Fundamental Rights. Justice Caroline Costello agreed that there were “well-founded grounds” for challenging the European Commission decision to approve SCCs as valid data transfer channels. The Irish judge held that only the ECJ has the jurisdiction to rule on the validity of a European measure.

The case is the latest to question whether the methods used by large tech firms such as Facebook, Google and Apple to transfer data outside the European Union provide EU consumers sufficient protection from US surveillance. It also affects other companies that store information across borders and seek to transfer it for business purposes.


The SEC Announces Two New Initiatives to Address Digital Token Sales

At the end of September, the Securities and Exchange Commission (“SEC”) announced two new initiatives to address cyber-based threats and protect retail investors. In its press release, the SEC outlined the creation of a Cyber Unit (the “Unit”) and a Retail Strategy Task Force (“RSTF”). The Unit will focus on targeting cyber-related misconduct, while the RSTF was established to deal with misconduct impacting retail investors.

To learn more about these two new initiatives, click here.

Germany’s new hate speech act in force: what social network providers need to do now

On 1 October 2017, the German Netzwerkdurchsetzungsgesetz (Network Enforcement Act, “NetzDG”), which we reported on in April and May, entered into force (English version here). The NetzDG is intended as an “act to improve enforcement of the law in social networks” and aims to combat fake news and hate speech. Regulatory offences can draw fines of up to EUR 5 million for individuals and up to EUR 50 million for the platform provider itself.

The NetzDG has been criticised since the beginning of the legislative process, as a great number of lawyers deem the law incompatible with the principle of freedom of expression and with the draft EU ePrivacy Regulation, intended to apply from 25 May 2018. Therefore, everyone is waiting in suspense for the first complaints brought against the law before the German Federal Constitutional Court, or even the European Court of Justice.

We have compiled the five key aspects of the NetzDG that social networks need to address to become NetzDG-ready.

Spanish DPA fines Facebook €1.2 million for data protection infringements

The Spanish Data Protection Authority (AEPD) has imposed a fine of €1.2 million against Facebook following its investigation into whether Facebook’s data processing activities were in accordance with the Spanish Data Protection Act (Law 15/1999) (the Act).

In its decision, the AEPD concluded that Facebook had committed serious breaches of the Act, as discussed further below.

Processing sensitive personal data for advertising purposes without consent

The AEPD held that Facebook did not obtain its users’ consent to the collection of their sensitive personal data in accordance with the requirements of the Act, which demands that such consent be valid, express and given in writing.

It was noted that Facebook uses the preferences of its users to profile them based on their sensitive personal data and to offer content tailored to that profile. However, Facebook did not establish a separate procedure for the treatment of sensitive personal data: prior consent was not requested, and all personal data was used for advertising profiling by default. For example, when configuring a user’s profile, the “Basic and Contact Information” section includes options to “add your religious beliefs” and “add your political ideology”. However, Facebook requests no express consent for the use of this information for advertising purposes, nor is the user informed at any stage that their data will be used for that purpose.

ICO publishes draft guidance on contracts and liabilities under the GDPR

The UK’s Information Commissioner’s Office (ICO) has published draft GDPR guidance on contracts and liabilities between controllers and processors. The draft guidance is currently open for consultation, with responses due by 10 October 2017.

The purpose of the guidance is to help organisations understand what needs to be included in written contracts between controllers and processors under the General Data Protection Regulation (GDPR). It also looks at the responsibilities and liabilities of controllers and processors.

Written contracts

Under the GDPR, a written contract must be in place when a controller uses a processor to process personal data. This is not a new concept, as data processing agreements are already used to satisfy the security requirements under the Data Protection Directive (95/46/EC). The GDPR, however, is wider in scope and now sets out specific terms that must be included in such contracts; for example, the subject matter and duration of the processing, the nature and purpose of the processing, the type of personal data to be processed, the categories of data subjects, and the obligations and rights of the controller. See Article 28.3 of the GDPR and page 12 of the draft guidance for further details.

The GDPR also allows for the use of standard contractual clauses issued by the European Commission or a supervisory authority (such as the ICO), and approved codes of conduct or certification schemes which processors can sign up to; however, these are not yet available.

Updated Draft of ePrivacy Regulation: Still Hampering Innovation

On 8 September 2017, the European Council published its first revisions (“Revised Draft”) to the draft EU ePrivacy Regulation (version COM(2017) 10 of 10 January 2017, “ePrivacy Regulation”). The Revised Draft is based on the discussions held in previous meetings of the European Union’s Working Party for Telecommunications and Information Society (“WP TELE”), and on comments provided by delegations.

The Revised Draft

The Revised Draft aims to clarify the European Commission’s previous draft and outlines issues to be discussed in further WP TELE meetings. It does not change the general scope of the ePrivacy Regulation. It tries, however, to be clearer about the territorial scope of application of the ePrivacy Regulation, and about excluding legal entities from the definition of data subjects. Even in its revised form, the ePrivacy Regulation seems like an “elephant” that hampers innovation in Europe.

The proposed amendments include, inter alia, several elements concerning tracking technologies and commercial communications.