In June, Facebook came under public scrutiny after it was revealed that the company carried out research in 2012 that manipulated the News Feeds of 689,000 users. Several regulators are now poised to investigate Facebook’s conduct.
The study altered the proportion of positive or negative posts shown to users in order to observe the effect on the way that they used the site. It found that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”
Meanwhile, on 3 July, the Electronic Privacy Information Center (‘EPIC’) filed a formal complaint with the U.S. Federal Trade Commission, requesting that the regulatory body undertake an investigation of Facebook’s practices. The FTC has not yet responded to this request.
Although perhaps an extreme example, this episode highlights the challenges that organisations can face when using data for a purpose that goes beyond what users would expect. Given the opaque algorithms that determine what any Facebook user sees (contrary to common belief, the News Feed is not simply a chronological list of activities), it is arguable that the issue here arises out of functionality that is not far removed from Facebook’s everyday operations. It will therefore be interesting to see whether the regulators take any robust action.