Recent rulings indicate the Fifth Amendment may join the Fourth Amendment as a critical consideration in courts’ efforts to apply constitutional protections to smartphones and other new technologies

The Fourth Amendment right of the people “to be secure in their persons, houses, papers, and effects” has been center stage in debates over technology that scarcely could have been imagined at the time it was written. See, e.g., Carpenter v. United States, 138 S. Ct. 2206 (2018); United States v. Jones, 565 U.S. 400 (2012). With less fanfare, however, the Fifth Amendment has emerged as another critical consideration in recent cases focused on the protection of information accessible only through biometric scans (such as fingerprint or facial recognition). In the latest example of this trend, the U.S. District Court for the Northern District of California found that the Fifth Amendment right against self-incrimination prohibited the compelled use of biometric smartphone unlocking features, such as fingerprint, thumbprint, facial, or iris recognition, in In the Matter of the Search of a Residence in Oakland, California, No. 4-19-70053, 2019 WL 176937 (N.D. Cal. Jan. 10, 2019). Cases like this one read the right even more broadly than those dealing with the compelled production of passwords. Practitioners should monitor this ongoing judicial dialogue about how the Fifth Amendment should apply to issues newly arising in the information age.

The Northern District of California’s Fifth Amendment analysis in Oakland

In Oakland, the Government applied for a warrant authorizing investigators to compel any individual present at a residence connected to two extortion suspects to utilize biometric features to unlock digital devices found at the residence. Relying on recent U.S. Supreme Court decisions directly addressing the Fourth Amendment, including Carpenter, U.S. Magistrate Judge Kandis A. Westmore ruled that law enforcement could not force suspects to use biometric features to unlock digital devices because using such a feature would be testimonial for purposes of the Fifth Amendment’s protection against self-incrimination. In addition, Judge Westmore ruled that the “foregone conclusion” exception did not apply. She thus denied the warrant application.

In her analysis of whether using biometric features would be testimonial, Judge Westmore was mindful of the fact that “technology is outpacing the law” in some areas. She noted the U.S. Supreme Court’s direction in Carpenter to take technological advances into account when addressing constitutional issues and noted that courts “have an obligation to safeguard constitutional rights and cannot permit those rights to be diminished merely due to the advancement of technology.”

ICO brings prosecution against SCL Elections

Earlier this month, the Information Commissioner’s Office (ICO) brought a criminal prosecution against the parent company of Cambridge Analytica, SCL Elections, for failing to comply with an enforcement notice issued by the ICO. SCL was fined £15,000 and ordered to pay costs.

The criminal prosecution may not sound surprising – after all, SCL had failed to comply with an enforcement notice, and the ICO is clearly taking a hard-line approach to enforcement. SCL, however, was in administration at the time of the enforcement notice. The key point to note here is that a company must still comply with its data protection responsibilities, including any enforcement notices, even while it is in administration.

Background

In January 2017, U.S. citizen Professor David Carroll made a subject access request to SCL. SCL responded by disclosing some personal data, but Professor Carroll suspected that SCL had not disclosed everything. SCL’s response also contained inadequate information about where the data had been obtained and how it would be used. He complained to the ICO, which shared his concerns.

The ICO contacted SCL in September 2017 to ask for further information. SCL was not cooperative, incorrectly claiming that Professor Carroll had no legal right to access the data because he was not a UK citizen or based in the United Kingdom. In rejecting SCL’s claim that a U.S. citizen has no legal right to access the data, the ICO confirmed that “anyone who requests their personal information from a UK-based company or organisation is legally entitled to have that request answered, in full, under UK data protection law.”

Financial penalty imposed for failure to protect personal data on website

On 22 January 2019, Singapore’s Personal Data Protection Commission issued its grounds of decision against COURTS (Singapore) Pte Ltd (Courts), a consumer electronics and furniture retailer in Singapore.

The facts of the case were as follows:

  • A complaint was brought by an individual who discovered that, when he entered his name and email address on Courts’ guest login page while making a purchase on its website, his contact number and address were disclosed on an automatically opened webpage.
  • The commission’s investigations revealed that when a customer checked out as a guest user and entered their name and email address, their contact number and residential address would also be displayed on the guest checkout page of the website.
  • As of 9 July 2017, Courts confirmed that it had a total of 14,104 personal data sets stored in its database.

“Worst breach of personal data in Singapore’s history” attracts highest penalties totalling S$1 million

On 14 January 2019, Singapore’s Personal Data Protection Commission issued its grounds of decision against Singapore Health Services Pte. Ltd. (SingHealth) and Integrated Health Information Systems Pte. Ltd. (IHiS) for what has been called the “worst breach of personal data in Singapore’s history”.

The unprecedented cyber attack on SingHealth’s patient database system led to the exfiltration of 1.5 million patients’ personal data and nearly 160,000 patients’ outpatient prescription records.

The commission received several complaints from members of the public regarding this data breach and commenced its investigations thereafter.

Treasury Committee inquiry into IT failures in the financial services sector

At the end of 2018, the UK Treasury Committee announced that it would launch an inquiry into information technology (IT) failures in the financial services sector. The Treasury Committee has stated that it will appoint a specialist adviser to provide analysis and aid the inquiry.

The past 18 months have seen numerous IT failures in the financial services sector: Equifax, Barclays and TSB, to name a few, have all suffered incidents. TSB’s is arguably the highest-profile case, with 1.9 million customers locked out of their online banking accounts for up to a month and some customers also claiming to have been able to view other customers’ bank details. This occurred after the bank attempted to migrate customer information from the systems of its former owner, Lloyds Banking Group, to those of its current owner, Banco Sabadell.

The inquiry by the Treasury Committee is set to explore the common causes of such operational incidents, to better understand what consumers have lost as a result of the failures, and to determine whether regulators such as the Bank of England’s Prudential Regulation Authority and the Financial Conduct Authority have the necessary powers to hold the firms involved to account.

Draft ethics guidelines for trustworthy artificial intelligence published by the European Commission

On 18 December 2018, the European Commission published draft ethics guidelines for trustworthy AI. The guidelines are voluntary and constitute a working document to be updated over time. The guidelines have been opened up to a stakeholder consultation process.

The guidelines recognise that there are benefits to be gained from AI, but that humankind can reap those benefits only if it can trust the technology – in other words, only if the technology qualifies as trustworthy AI. An overarching principle in the guidelines is that AI should be human-centric, with the aim of increasing human well-being.

Trustworthy AI is defined as having two components:

  1. respect for fundamental rights, ethical principles and societal values – an “ethical purpose”; and
  2. technical robustness and reliability.

The guidelines set out a framework for implementing and operating trustworthy AI, aimed at stakeholders who develop, deploy or use AI.

Data brokers begin 2019 with new Vermont law

A new Vermont law imposing data security and annual disclosure obligations on data brokerage companies (e.g., Acxiom, Experian, Epsilon) came into effect on January 1, 2019. Data brokers are required to register annually with the Vermont Attorney General and pay an annual registration fee. Each year, data brokers must also disclose to the State Attorney General information regarding their practices for the collection, storage and sale of personal information, applicable opt-out practices (if any), and the number of data breaches experienced during the prior year along with the total number of consumers affected by those breaches (if known). Brokers that do not comply with the new law may be considered in violation of Vermont’s consumer protection law. To read more on the new law, visit our AdLaw By Request blog.

Brexit countdown: UK government to amend domestic data protection legislation

The Data Protection, Privacy and Electronic Communications (Amendments etc.) (EU Exit) Regulations 2019 have been laid before the UK Parliament.

The regulations are introduced under the European Union (Withdrawal) Act 2018. The Withdrawal Act grants powers to correct deficiencies in UK legislation that will arise as a result of Brexit.

The regulations introduce a large number of technical amendments to UK law. The main amendments are made to:

  1. the General Data Protection Regulation 2016/679 (GDPR) as retained by UK law;
  2. the Data Protection Act 2018 (DPA 2018); and
  3. the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).

When the United Kingdom (UK) leaves the European Union (EU), the UK will no longer be subject to obligations under the GDPR (except for processing still caught by the GDPR’s extra-territorial scope). However, the Withdrawal Act provides that the text of the GDPR will form part of UK domestic law after Brexit (the “UK GDPR”). As a result, the text of the UK GDPR must be amended to remedy deficiencies that will arise once the UK is no longer part of the EU. The text of the DPA 2018 must also be amended to implement the UK GDPR.

First two Singapore data protection enforcement decisions issued in 2019

On January 3, 2019, Singapore’s Personal Data Protection Commission issued two grounds of decision against Bud Cosmetics and AIG Asia Pacific Insurance Pte Ltd & Toppan Forms (S) Pte Ltd.

Bud Cosmetics

The facts of this case were as follows:

  • Bud Cosmetics is an organic and natural skincare retailer with retail outlets in Singapore and an online store.
  • It collected customer information for membership registration and maintained two separate databases: one for online registrations and another for registrations in person at its retail outlets.
  • As part of its marketing activities, Bud Cosmetics sent its customers e-newsletters with its latest promotional offers and products. Such e-newsletters were generated by selecting members’ email addresses from both online and offline databases based on certain criteria. After an e-newsletter was sent out, the customer mailing list for that particular e-newsletter would be stored in an archive folder.
  • An individual complainant discovered a URL link to one of Bud Cosmetics’ member lists when she conducted an Internet search on her own name. The list contained the names, dates of birth, contact numbers, email addresses and residential addresses of approximately 2,300 persons.
  • The member list was located in the image folder for an e-newsletter that was sent out in 2012 and hosted on a third-party server based in Australia. This system was hacked in April 2012. Bud Cosmetics switched web hosting companies in 2013, and engaged a U.S. entity with servers located in Provo, Utah.

Social plug-ins – Advocate General issues opinion on joint controllership case

On 19 December 2018, the Advocate General (AG) delivered an opinion in a case concerning Fashion ID and Facebook, which considered the parties’ status as joint controllers, under the Data Protection Directive 95/46/EC (DP Directive), when a social plug-in had been embedded.

Fashion ID’s website inserted Facebook’s ‘Like’ button as a plug-in, allowing personal data, such as the user’s IP address and browser journey, to be transferred to Facebook regardless of whether the user clicked on the Facebook Like button. A consumer protection association brought a claim against Fashion ID, arguing that the use of the Facebook Like button was a breach of data protection laws.
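The mechanics behind this complaint can be illustrated with a short sketch. The snippet below is hypothetical (the domain, function and field names are illustrative, not Facebook’s actual code); it simply shows why embedding a third-party plug-in causes visitor data to be transmitted as soon as the page loads, before the user clicks anything.

```typescript
// Hypothetical sketch: a third-party plug-in script builds a request to
// its own server the moment it runs, i.e. on page load. No click on the
// "Like" button is required for the data transfer to occur.

// Data the embedding page exposes to any third-party script it loads.
interface VisitorContext {
  pageUrl: string;   // the page being browsed (the user's "browser journey")
  userAgent: string; // browser details; the server also sees the IP address
}

// The URL the plug-in would request from the third party's server on load.
function buildPluginRequest(ctx: VisitorContext): string {
  const params = new URLSearchParams({
    referrer: ctx.pageUrl,
    ua: ctx.userAgent,
  });
  // The visitor's IP address travels implicitly with this HTTP request.
  return `https://social-network.example/plugin/like?${params}`;
}

// Fires when the embedding page executes the third-party script.
const request = buildPluginRequest({
  pageUrl: "https://fashion-id.example/shoes",
  userAgent: "Mozilla/5.0",
});
console.log(request);
```

Because the request is issued by the page itself during loading, the website operator has no visibility into (or control over) what the third party does with the data once it is received, which is what makes the controllership question in the case significant.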

The AG’s opinion focuses on four main areas. The first proposal in the opinion is that the DP Directive did not preclude national legislation granting consumer protection associations standing to bring claims against alleged infringers of data protection law. The remaining three proposals are discussed further below.
