CIVIL RIGHTS GROUP the Electronic Privacy Information Center (EPIC) has lodged a complaint with the US Federal Trade Commission (FTC) following the recent revelations about a mood experiment by Facebook.
EPIC, which has history with the social network, wants the FTC to consider how the study, which involved no user consent, squares with the 20-year privacy commitment that Facebook made to the FTC.
"The company purposefully messed with people's minds," EPIC said as it introduced its complaint.
"At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers," it added in its complaint (PDF).
"Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties."
It said that the study is a "deceptive practice" that warrants FTC attention.
Earlier this week Fight for the Future launched a petition in a bid to stop Facebook from experimenting on its users.
"Facebook made thousands of people sad on purpose while running a test on users without their consent," Fight for the Future writes on its website, and is calling for users to "tell Facebook to stop experimenting with its users".
As well as calling for Facebook to stop turning its users into guinea pigs, Fight for the Future is asking the social network to inform those affected by the experiment, and to disclose details regarding any other similar experiments that have been conducted.
Facebook also faces a probe from the UK's Information Commissioner's Office (ICO) over its controversial user experiment, with the watchdog set to investigate whether the social network broke the law.
The Financial Times has heard from the ICO that it will examine Facebook's user experiment, which saw it manipulating News Feeds, to see whether the firm has broken the law. An ICO spokesperson told the newspaper that it was "too early to tell exactly what part of the law Facebook may have infringed".
It seems Facebook's user study could get it into a fair bit of bother, as on Tuesday it was revealed that the firm added a "research" clause to its terms and conditions (T&Cs) four months after it began manipulating what users saw on the social network.
So said Forbes, which reported that four months after the controversial study took place, Facebook added a clause to its T&Cs to cover its tracks. The addition said that the social network could use user information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement".
This move, should Facebook be taken to court about its experiment, could undermine the firm's defence that it had "informed consent" from users to be participants in the research.
However, Facebook has defended this, telling The Guardian, "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction."
Facebook's experiment was uncovered on Monday. A report published in the Proceedings of the National Academy of Sciences (PNAS) journal said that researchers at Cornell University and the University of California purposefully removed positive messages from user feeds to see if that would induce feelings of depression in users.
This is not the sort of thing that people like to hear about, and Facebook was quick to offer an explanation. It said that this happened over the course of a week, over two years ago, and claimed that it harmed no one.
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible," said a spokesperson.
"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
The head of the study, Facebook data scientist Adam Kramer, has published an 'apology' post to his own Facebook account, where again we are told that no harm was done and that anything that happened was for the greater good. However, he added that perhaps the study did not go as well as planned. This suggests that the 'good' aspect of the exercise was limited.
"The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," said Kramer.
"I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety." µ