


In 2012, Facebook Altered Content To Tweak Readers' Emotions 130

The Atlantic reports that two years ago, Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize either a negative or a positive tone, and observing the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves. This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it. At least they showed their work.
This discussion has been archived. No new comments can be posted.


  • by sjbe ( 173966 ) on Sunday June 29, 2014 @08:47AM (#47344097)

    This sort of thing is exactly why I have never signed up for an account. The lack of a moral compass at this company is profound.

  • by forand ( 530402 ) on Sunday June 29, 2014 @08:48AM (#47344103) Homepage
    This is quite interesting research that should never have been done. I am rather surprised that the National Academy published the results of a study which violated multiple ethical guidelines put in place to protect human subjects. Did Facebook track the number of suicides in the 700,000-user sample? Did those given a sadder-than-average stream have a higher or lower rate? Do the Facebook researchers address the ethical questions posed by performing such an experiment at all?
  • Re:consent (Score:5, Insightful)

    by Sqr(twg) ( 2126054 ) on Sunday June 29, 2014 @09:00AM (#47344143)

    [citation needed]. Almost every major website does A/B testing [wikipedia.org]. Is there a law against this? (That's not a rhetorical question. I actually would like to know.)
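    For readers unfamiliar with how the A/B testing mentioned above typically works: users are deterministically split into buckets, each bucket sees a different variant, and outcomes are compared. A minimal sketch in Python (the function name, experiment label, and variant names here are hypothetical, not from any real Facebook system):

    ```python
    import hashlib

    def ab_bucket(user_id: str, experiment: str,
                  variants=("control", "treatment")) -> str:
        """Deterministically assign a user to a variant by hashing
        the experiment name together with the user ID."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        index = int(digest, 16) % len(variants)
        return variants[index]

    # The same user always lands in the same bucket for a given
    # experiment, so their experience stays consistent across visits.
    print(ab_bucket("user42", "feed_tone_study"))
    ```

    Because assignment is a pure function of (experiment, user), no per-user state needs to be stored, and different experiments hash the same user into independent buckets.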

  • by Anonymous Coward on Sunday June 29, 2014 @09:06AM (#47344169)

    What exactly is considered a "psychological experiment"? Your definition seems very vague, and implies that any sort of software usability testing or change that involves offering different experiences to different users should be outlawed or very strictly controlled.

    Take Mozilla Firefox as a recent example. Firefox 28 had a shitty, but at least partially usable user interface. Then Mozilla released Firefox 29, which brought in the Australis user interface, which is indisputably a pretty much unusable pile of feces. The psychological impacts of these changes are profound. Users who use Firefox 28 tend to be agitated due to its bad UI. But users who use Firefox 29 or later are often foaming at the mouth with outright anger over the horrible experience they're being subjected to.

    Under your definition, it would be "wrong" to compare the user experience of those using Firefox 28 versus those using Firefox 29 or later without having them sign a bunch of paperwork.

  • by JustNiz ( 692889 ) on Sunday June 29, 2014 @09:59AM (#47344325)

    It seems it's pretty much the same for every other large US company too.

  • Re:consent (Score:3, Insightful)

    by Anonymous Coward on Sunday June 29, 2014 @10:01AM (#47344337)

    I don't think this is primarily about whether Facebook had a legal right to do this (more on that in a minute); it's about whether it was ethical on their part. And I think it clearly was not ethical for the researchers to run this study without getting the informed consent of the users who took part in it.

    Getting back to the legal issue: every site or app has a Terms of Service agreement. Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes? If it doesn't, it seems to me that could be a legal issue.

  • Re:consent (Score:2, Insightful)

    by nospam007 ( 722110 ) * on Sunday June 29, 2014 @10:52AM (#47344527)

    "There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting..."

    They can do whatever they want; it's their site. They decide what to show you, whether to show it to you, and when.
    And they have the right to sell that information about you at any price they choose.
    It's their business.
    You are the product, the psychologists are the customer.

  • Re:consent (Score:4, Insightful)

    by sribe ( 304414 ) on Sunday June 29, 2014 @11:05AM (#47344597)

    They can do whatever they want, it's their site.

    Did you think about that before you wrote it? If not, take a second and think about it.

    There are many, many, many things they cannot do with their site.

  • Re:consent (Score:3, Insightful)

    by wisnoskij ( 1206448 ) on Sunday June 29, 2014 @04:26PM (#47346129) Homepage
    A psychological experiment cannot be called innocuous before the results are in. Who knows, maybe an extremely depressed person is 20 times more likely to commit suicide if they see that the world is 100% perfectly happy and positive.

"Go to Heaven for the climate, Hell for the company." -- Mark Twain