Facebook is being investigated by the Information Commissioner’s Office in the UK after the publication of a study revealed that a psychological experiment had influenced what users saw in their news feeds, raising fresh privacy concerns.
A company researcher apologized on June 29 for a test in January 2012 that altered the number of positive and negative comments that almost 700,000 users saw on their online feeds of articles and photos.
Disclosure of the experiment prompted some members to express outrage on Twitter about the research as a breach of privacy.
Facebook “communicated poorly” about the experiment, Chief Operating Officer Sheryl Sandberg said today at a New Delhi event to promote her book “Lean In: Women, Work and the Will to Lead.”
No informed consent
A number of other researchers have condemned Facebook’s experiment, saying it breaches ethical guidelines for “informed consent”.
James Grimmelmann, professor of law at the University of Maryland, points out in an extensive blog post that “Facebook didn’t give users informed consent” to allow them to decide whether to take part in the study, as required under US human-subjects research rules. “The study harmed participants,” because it changed their mood, Grimmelmann comments, adding: “This is bad, even for Facebook.”
A spokesman for the ICO said yesterday that the agency would be speaking with Facebook and working with the Irish Data Protection Commissioner to learn more about the circumstances.
The ICO is investigating whether the company broke data-protection laws, though it is too early to say which part of the law Facebook may have infringed.
The Irish Data Protection Commissioner’s office has been in contact with Facebook on privacy issues, including consent in relation to the research, and is awaiting a full report from the company, said John O’Dwyer, a spokesman for the agency. Facebook’s compliance with European Union law is governed by Ireland, because its European headquarters are in Dublin.
“It’s clear that people were upset by this study and we take responsibility for it,” said Richard Allan, a spokesman for Facebook in the U.K., in an e-mailed statement. “We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”
Findings of the study
According to a study published June 17 in the Proceedings of the National Academy of Sciences, the number of positive and negative comments that users saw on their news feeds was changed in January 2012. In the trial, which involved randomly selected Facebook users, people shown fewer positive words were found to write more negative posts, while the reverse happened with those exposed to fewer negative terms.
The data showed that online messages influence readers’ “experience of emotions,” which may affect offline behaviour, the researchers said.
Analysis: ‘One of many consumer experiments’
Suzy Moat, Assistant Professor of Behavioural Science at Warwick Business School, said: “In many ways, this experiment is simply a public example of the experiments that many businesses run on a regular basis to investigate how they can influence our behaviour.
“For example, Facebook and Amazon constantly experiment with showing different groups of people slightly different versions of their websites to see if one version is better at increasing the frequency with which users engage with the content, click on adverts, or buy products. You can opt out of these ‘experiments’ by not using the websites – but many people don’t want to do this, as the services offer such value to them.
“So it’s interesting that there’s such outrage about this, but not about the experiments which many online businesses run on all of us on a day-to-day basis – possibly because a lot of people simply don’t realise that they’re happening. However, scientific experiments are generally supposed to be run for public good, not for business interests, and it’s obvious that many people currently feel that this experiment was not in their best interest.
“On one hand, the experiment has already done a lot of public good, by clearly demonstrating how small changes on a widely used service can affect the behaviour of large numbers of people – something which many people may not have previously realised. Crucial to this realisation is the fact that the methods and results have been made publicly available, in contrast to experiments run for business purposes.
“On the other hand, it’s extremely understandable that many people are upset that their behaviour may have been manipulated for purely scientific purposes without their consent. In particular, Facebook’s user base is so wide that everyone wonders if they were in the experiment.
“This highlights a need for scientists to think back to how experiments in the social sciences have traditionally been run, and consider how open and transparent ethics procedures can be redesigned to take new technology such as Facebook into account. We also need to look at existing successful online experiments, such as Mappiness, which have recruited extremely large numbers of informed participants, simply by offering sufficient benefits for participating – just like a business does.”