AI Auditing Framework: data protection impact assessment

In March 2019, the Information Commissioner’s Office (ICO) released a Call for Input on developing the ICO’s framework for artificial intelligence (AI). The ICO simultaneously launched its AI Auditing Framework blog to provide updates on the development of the framework and encourage organisations to engage on this topic with the ICO.

On 23 October 2019, the ICO released the latest in its series of blogs (here). The blog outlines the key elements that organisations should focus on when carrying out a Data Protection Impact Assessment (DPIA) for AI systems that process personal data.

We have outlined below some of the key takeaways.

Continue Reading

Bipartisan social media data portability bill introduced in U.S. Senate

Social media users may soon be able to easily transfer their personal information to competing platforms. On October 22, 2019, a bipartisan group of U.S. senators (Mark R. Warner (D-VA), Josh Hawley (R-MO), and Richard Blumenthal (D-CT)) introduced the Augmenting Compatibility and Competition by Enabling Service Switching Act (ACCESS Act), a bill aimed at encouraging market-based competition among today’s major social media platforms by requiring the largest of these tech companies to allow users to move their data from one service to another.

The bill, should it become law, would be enforced by the Federal Trade Commission (FTC), and would require large communications platforms (products or services with over 100 million monthly active users in the U.S.) to:

  • Make users’ personal data portable, by allowing users to retrieve and/or transfer their personal data in a structured, machine-readable format.
  • Maintain interoperability with other platforms, including competing companies.
  • Give users the ability to designate a trusted third-party service to manage their privacy, content, online interactions, and account settings.
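The "structured, machine-readable format" requirement echoes the data portability right in the GDPR; in practice, platforms typically satisfy such obligations with exports in a common interchange format such as JSON. The sketch below is purely illustrative — the record fields and structure are our own assumptions, and the ACCESS Act does not prescribe any particular format:

```python
import json

# Hypothetical example of a user's exported social media record.
# The field names are illustrative assumptions, not taken from the bill.
user_export = {
    "profile": {"username": "jdoe", "display_name": "Jane Doe"},
    "posts": [
        {"id": 1, "text": "Hello, world!", "timestamp": "2019-10-22T09:00:00Z"},
    ],
    "settings": {"private_account": False},
}

# A structured, machine-readable export that a competing platform could parse.
portable = json.dumps(user_export, indent=2)
print(portable)
```

Because the export is plain JSON, a receiving service can re-parse it with any standard library, which is the interoperability the bill is driving at.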

Continue Reading

ICO blogs on AI and data subject rights

On 15 October 2019, the Information Commissioner’s Office (ICO) released the latest in its series of blogs on developing its framework for auditing artificial intelligence (AI). The blog (here) focuses on AI systems and how data subjects can exercise their rights of access, rectification and erasure in relation to such systems. Below, we summarise some of the key takeaways and our thoughts on the subject.

Rights relating to training data

Organisations need data in order to train machine learning models. While it may be difficult to identify the individual to whom the training data relates, it may still be personal data for the purposes of the General Data Protection Regulation (GDPR), and so will still need to be considered when responding to data subject rights requests under the GDPR. Provided no exception applies and reasonable steps have been taken to verify the identity of the data subject, organisations are obliged to respond to data subject access requests in relation to training data. The right of rectification may also apply but, as an individual inaccuracy is less likely to have a direct effect on a data subject who is part of a large data set, organisations should prioritise rectifying personal data that may have a direct effect on the individual.

Complying with requests from data subjects to erase training data may prove more challenging. If an organisation no longer needs the personal data because the machine learning model has already been trained, the ICO advises that the organisation must fulfil the request to erase. However, organisations may need to retain training data where the machine learning model has not yet been trained. The ICO advises that organisations should consider such requests on a case-by-case basis, but does not provide clarity on the factors organisations should consider.

Continue Reading

At odds no more: can regulatory collaboration bring innovation and data privacy closer together?

In July 2019, the UK’s Financial Conduct Authority (FCA) held a week-long Global Anti-Money Laundering and Financial Crime TechSprint (FCA TechSprint) event. The FCA TechSprint looked at ways to effectively combat financial crime and money laundering within the financial services industry. On 16 October 2019, the Information Commissioner’s Office (ICO) released a blog (here) that focuses on the lessons learnt from the FCA TechSprint.

Background

The FCA TechSprint brought together teams from all over the world to explore how encryption techniques known as privacy enhancing technologies (PETs) can facilitate data and knowledge sharing among financial institutions, regulators and law enforcement agencies to detect and prevent money laundering and financial crime, while remaining compliant with data protection and privacy laws.

The teams worked towards developing solutions to the following use cases:

  • how can a network of market participants use PETs and data analytics to interrogate financial transactions stored in databases within institutions to identify credible suspicions without compromising data privacy legislation?
  • how can market participants efficiently and effectively codify typologies of crime which can be shared and readily implemented by others in their crime controls?
  • how can a market participant check that the company or individual they are performing due diligence on has not raised flags or concerns within another market participant, and/or verify that the data elements they have for the company or individual match those held by another market participant?
  • how can technology be used to assist in identifying an ultimate beneficial owner across a network of market participants and a national register?
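One simple privacy-enhancing technique relevant to the third use case is comparing salted hashes of identifiers rather than the identifiers themselves, so two institutions can check whether their records match without exchanging raw customer data. The sketch below is a minimal illustration only — the shared-salt arrangement and record values are our own assumptions, and production PET deployments at the TechSprint involved far stronger techniques (for example, private set intersection or homomorphic encryption):

```python
import hashlib

def pseudonymise(value: str, shared_salt: str) -> str:
    """Hash an identifier with a salt agreed between participants,
    so matches can be detected without exchanging raw data."""
    return hashlib.sha256((shared_salt + value).encode("utf-8")).hexdigest()

# Illustrative assumption: the salt is agreed out of band between institutions.
SALT = "salt-agreed-out-of-band"

# Each institution hashes its own customer records locally...
bank_a_records = {pseudonymise(v, SALT) for v in ["Acme Ltd", "Jane Doe"]}
bank_b_records = {pseudonymise(v, SALT) for v in ["Jane Doe", "Globex Corp"]}

# ...and only the hashes are compared, revealing matches but not raw values.
matches = bank_a_records & bank_b_records
print(len(matches))  # one common record ("Jane Doe")
```

The design point is that neither party learns the other's non-matching records, which is precisely the tension between information sharing and data protection law that the use cases describe.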

The ICO’s Regulators’ Business Innovation Privacy Hub was present at the FCA TechSprint to offer guidance on the data protection implications of implementing PETs.

Continue Reading

Guidance given on Singapore cross-border data transfer obligation for intermediaries and cloud providers

In Singapore, private sector organisations must generally comply with the transfer limitation obligation in the Personal Data Protection Act (the Act). Any transfer of personal data outside Singapore must be in accordance with the Act’s requirements, to ensure that a comparable standard of protection is accorded to that data.

However, where an organisation is a data intermediary, i.e., it processes personal data on behalf of and for the purposes of another pursuant to a written contract, that intermediary is not subject to the transfer limitation obligation, as specified in section 4(2) of the Act.

Continue Reading

IAB issues CCPA compliance framework for public comment

Given the vast challenges California’s sweeping new privacy law, the California Consumer Privacy Act (CCPA), poses for digital marketing, the Interactive Advertising Bureau (IAB) released for public comment a draft of its proposed Compliance Framework for Publishers & Technology Companies (the Framework) on October 22.

“Selling” and CCPA challenges for digital marketing. Those who have been actively preparing for CCPA’s implementation on January 1 know by now that pursuant to section 1798.115(d) of the CCPA, a company that has personal information about a consumer may not onward “sell” (as defined in the CCPA) such information to another party without the consumer (1) having received explicit notice of the sale of the personal information and (2) being given the right to opt out pursuant to section 1798.120. Under the CCPA, even if consumers opt out of having their personal information sold, the information may be shared with third parties acting as “service providers” for limited purposes, but the party disclosing the personal information (that is, the “business”) is very specifically limited in its ability to use any data it received that is deemed “personal information.”

Current information sharing practices. Currently, in the programmatic advertising ecosystem, publishers may pass personal information about visitors to their website to downstream participants (the Downstream Participants) who then may pass such information on to others in the supply chain. These Downstream Participants include providers such as:

  • supply-side platforms (SSPs)
  • demand-side platforms (DSPs)
  • ad exchanges
  • ad networks
  • ad tech platforms
  • data management platforms (DMPs)

Downstream Participants also include the advertiser who ultimately purchases the ad, funds the ecosystem, and, in many cases, expects to have ready and trusted access to information associated with its advertising activity and consumer behavior in response to such advertising.

Continue Reading

Courts continue to consider intersection of Fourth Amendment and technology: without a warrant, retrieval of car’s electronic data unconstitutional, but surveillance on hunting property permissible

The Fourth Amendment has received significant attention in recent court rulings involving surveillance, electronic data retrieval, and other types of technology. Two rulings issued on October 21, 2019 demonstrate how difficult it can be to anticipate the outcome of Fourth Amendment disputes relating to technology. In one, the Georgia Supreme Court found the warrantless search of electronic data from a car following a fatal accident to be unconstitutional. In the second, the U.S. District Court for the Western District of Tennessee held that the Fourth Amendment permitted the warrantless placement of a government surveillance camera on a man’s private hunting and fishing property.

Mobley v. State (Ga. Oct. 21, 2019)

In Mobley, the Georgia Supreme Court ruled that a trial court erred in denying a motion to suppress evidence that law enforcement retrieved from the electronic data recorder in the defendant’s car. In coming to this conclusion, the Mobley court ruled that – regardless of any reasonable expectation of privacy – the physical entry of a police officer into the defendant’s car to retrieve the electronic data was a search for Fourth Amendment purposes.

The Mobley case arose after a car driven by defendant Mobley collided with a car that pulled out of a private driveway; both occupants of the latter car died. Before the cars were removed from the accident scene, a police investigator entered both cars, and attached a crash data retrieval device to data ports in the cars to download available data. The data revealed that shortly before the collision, Mobley’s car was traveling almost 100 miles per hour. The next day, law enforcement applied for a warrant to seize the electronic data recorders. The warrant was issued, but no additional data was retrieved from the recorders. A grand jury indicted Mobley on a number of counts, including vehicular homicide.

Continue Reading

Uncertainty hangs over the life sciences and healthcare industries in draft regulations of The California Consumer Privacy Act

On October 10th, the Attorney General of California, Xavier Becerra, delivered the highly anticipated text of the proposed California Consumer Privacy Act (CCPA) regulations. However, the regulations left untouched and unexplained the exemptions relating to the Health Insurance Portability and Accountability Act, the California Confidentiality of Medical Information Act, and clinical research. The industry has grappled, and will continue to grapple, with these exemptions, which lack crucial definitions and remain ambiguous.

For further analysis of this matter, please visit Life Sciences Legal Update.

Implications for employers and the biometric landscape under New York’s expanded data security law

Over the past several years, legislators from coast to coast have increasingly made data privacy and cybersecurity top priorities. The result has been a spike in the number and stringency of laws that impose proactive and reactive responsibilities – covering, for instance, data security and breach notifications – on companies that collect personal information, whether from their customers, their employees, end users, or others. That legislative trend has recently expanded previous obligations of companies conducting business in New York state.

On October 23, 2019, New York’s Stop Hacks and Improve Electronic Data Security (SHIELD) Act went into effect.  The law broadens the state’s existing breach notification laws and imposes new security obligations on companies doing business in New York, including an expanded focus on how companies handle biometric data. The SHIELD Act also applies to employee information, as long as there is at least one employee in New York state – regardless of the size or location of the company.  As such, the law will have a significant impact on businesses across the country that have private information about consumers and employees based in New York.

For more information about the ramifications of the SHIELD Act, visit ReedSmith.com.


Latin America to bolster data protection in a legal overhaul

The General Data Protection Regulation (GDPR) has prompted a series of legislative proposals in Latin American countries to update data protection regulations, many of which reflect the higher standards of the GDPR. With a large number of European and U.S. companies operating in the region, we look at some of the latest developments below.

Argentina

Argentina was the first Latin American country to implement data protection laws and the first non-European country to be recognised by the European Commission as having an adequate level of data protection. Since the Argentinian Personal Data Protection Act 2000 came into force, technological advances and the changed international landscape following the introduction of the GDPR have created a need to revisit the current legislation.

Argentina’s new draft data protection bill proposes further changes to bring the country’s data protection law in line with the GDPR. The bill acknowledges the right to be forgotten and the right to data portability. Other changes include stricter provisions in the area of cross-border transfers to countries with inadequate levels of data protection, new legal bases for data processing other than data subject consent, including legitimate interests, and new definitions of biometric and genetic data.

Continue Reading
