In 2012, Facebook Altered Content To Tweak Readers' Emotions
The Atlantic reports that two years ago, Facebook briefly ran an experiment on a subset of its users, altering the mix of content shown to them to emphasize either positive or negative tone and observing the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
At least they showed their work.
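The tone-based feed skewing described in the summary can be sketched in a few lines. This is a hypothetical illustration, not the study's actual pipeline: the word lists and filtering rule here are invented for clarity (the PNAS study classified posts using the LIWC word lists).

```python
# Hypothetical sketch of tone-based feed filtering like that described
# above. The word lists are illustrative assumptions, not from the study.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "awful", "lonely"}

def tone(post: str) -> int:
    """Return +1 for net-positive wording, -1 for net-negative, 0 otherwise."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

def filter_feed(posts, condition):
    """Suppress posts of the opposing tone for users in a given condition."""
    if condition == "positive":
        return [p for p in posts if tone(p) >= 0]
    if condition == "negative":
        return [p for p in posts if tone(p) <= 0]
    return list(posts)
```

A user assigned to the "positive" condition would simply never see the posts the classifier scores as net-negative, which is the kind of silent manipulation the comments below object to.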
Why I don't have a Facebook account (Score:5, Insightful)
This sort of thing is exactly why I have never signed up for an account. The lack of a moral compass at this company is profound.
Ethical Responsibility (Score:5, Insightful)
Re:consent (Score:5, Insightful)
[citation needed]. Almost every major website does A/B testing [wikipedia.org]. Is there a law against this? (That's not a rhetorical question. I actually would like to know.)
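For context on the A/B testing mentioned above: sites typically assign each user to a stable experiment bucket by hashing their ID, so the same user always sees the same variant. A minimal sketch, with an assumed 50/50 split and an illustrative key format:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  treatment_fraction: float = 0.5) -> str:
    """Deterministically map a user to 'treatment' or 'control'.

    Hashing (experiment, user_id) keeps assignments stable per user and
    independent across experiments. The key format is an assumption.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to [0, 1) and compare to the split point.
    fraction = int(digest[:8], 16) / 0x100000000
    return "treatment" if fraction < treatment_fraction else "control"
```

The ethical question in this thread is not whether this mechanism is legal in general, but whether it may be used to deliberately manipulate users' emotions without informed consent.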
That's pretty damn vague, son. (Score:1, Insightful)
What exactly is considered a "psychological experiment"? Your definition seems very vague, and implies that any sort of software usability testing or change that involves offering different experiences to different users should be outlawed or very strictly controlled.
Take Mozilla Firefox as a recent example. Firefox 28 had a shitty, but at least partially usable user interface. Then Mozilla released Firefox 29, which brought in the Australis user interface, which is indisputably a pretty much unusable pile of feces. The psychological impacts of these changes are profound. Users who use Firefox 28 tend to be agitated due to its bad UI. But users who use Firefox 29 or later are often foaming at the mouth with outright anger over the horrible experience they're being subjected to.
Under your definition, it would be "wrong" to compare the user experience of those using Firefox 28 versus those using Firefox 29 or later without having them sign a bunch of paperwork.
Re:Why I don't have a Facebook account (Score:5, Insightful)
It seems it's pretty much the same at every other large US company, too.
Re:consent (Score:3, Insightful)
I don't think this is about whether Facebook had a legal right to do this (more on that in a minute); it's about whether it was ethical on their part. Either way, I think it was clearly unethical for the researchers to run this study without the informed consent of the users who took part.
Getting back to the legal issue: every site or app has a Terms of Service agreement. Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes? If it doesn't, it seems to me that could be a legal issue.
Re:consent (Score:2, Insightful)
"There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting..."
They can do whatever they want; it's their site. They decide what to show you, whether to show you anything at all, and when.
And they have the right to sell that information about you at any price they choose.
It's their business.
You are the product, the psychologists are the customer.
Re:consent (Score:4, Insightful)
They can do whatever they want, it's their site.
Did you think about that before you wrote it? If not, take a second and think about it.
There are many, many, many things they cannot do with their site.
Re:consent (Score:3, Insightful)