Facebook's Emotion Experiment: Too Far, Or Social Network Norm? 219

Facebook's recently disclosed 2012 experiment in altering the tone of what its users saw in their newsfeeds has brought it plenty of negative opinions to chew on. Here's one, pointed out by an anonymous reader: Facebook's methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."

For a very different take on the Facebook experiment, consider this defense of it from Tal Yarkoni, who thinks the criticism it's drawn is "misplaced": "Given that Facebook has over half a billion users, it's a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people's emotions. Yet nobody seems to complain about this much, presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more. ... [H]aranguing Facebook and other companies like it for publicly disclosing scientifically interesting results of experiments that it is already constantly conducting anyway, and that are directly responsible for many of the positive aspects of the user experience, is not likely to accomplish anything useful. If anything, it'll only ensure that, going forward, all of Facebook's societally relevant experimental research is done in the dark, where nobody outside the company can ever find out, or complain, about it."
  • more interesting... (Score:5, Interesting)

    by Selur ( 2745445 ) on Monday June 30, 2014 @06:38AM (#47348833)

    It doesn't sound like this is the first experiment done by the Facebook crowd -> What other experiments happened? Were the participants informed about them later? Who takes the blame if such an experiment results in someone getting hurt?

  • by bickerdyke ( 670000 ) on Monday June 30, 2014 @07:05AM (#47348909)

    I find it pretty shocking that so many people are having difficulty understanding the difference between A/B testing and intentional emotional manipulation where a significant negative (or positive) result was the data point the study strove to measure.

    Creating an emotional response is part of marketing and therefore webdesign.

    Of course you're not directly monitoring emotions as a data point during A/B tests. You measure, e.g., the clicks, pages read, or the time spent on the website. But every marketing guy worth his salt could tell you that you can increase all of that by "making the user feel at home".
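    The metrics this comment describes are usually compared between the two variants with a simple significance test. A hypothetical sketch of that readout, using a two-proportion z-test (the counts, function name, and variants are invented for illustration, not anything from Facebook's actual pipeline):

    ```python
    # Sketch of an A/B test readout: given click counts for two feed variants,
    # run a two-proportion z-test to judge whether the observed difference in
    # click-through rate is likely real. All numbers here are illustrative.
    from math import sqrt, erf

    def ab_test(clicks_a, views_a, clicks_b, views_b):
        """Return (rate_a, rate_b, two-sided p-value) for a two-proportion z-test."""
        rate_a = clicks_a / views_a
        rate_b = clicks_b / views_b
        pooled = (clicks_a + clicks_b) / (views_a + views_b)
        se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
        z = (rate_a - rate_b) / se
        # Two-sided p-value from the standard normal CDF (via the error function).
        p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return rate_a, rate_b, p

    rate_a, rate_b, p = ab_test(clicks_a=120, views_a=10_000,
                                clicks_b=165, views_b=10_000)
    print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p = {p:.4f}")
    ```

    Note that nothing in this readout involves emotions at all: the experimenter only sees aggregate engagement counts, which is the distinction the comment is drawing.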

  • by N1AK ( 864906 ) on Monday June 30, 2014 @07:07AM (#47348915) Homepage
    Where do you draw the line? If Facebook realised that showing more negative stories (by monitoring what people already see) makes people more likely to click adverts is that really any better/worse than them artificially increasing/decreasing the amount of positive stories a user sees?

    If Google was having a hard time deciding if a page was junk or not, would it be unethical to put it in the results for some users and see how they react? Clearly that's an experiment without user knowledge, but it certainly doesn't sound unethical to me, and stopping that kind of experimentation, or flooding sites with notices about it, wouldn't make things better for users.

    Obviously there are experiments they could run that would be unethical if users weren't informed and monitored; discussing where the lines are and agreeing some best practices would therefore make sense.
  • by by (1706743) ( 1706744 ) on Monday June 30, 2014 @07:10AM (#47348923)
    According to the WSJ's coverage [] ,

    The impetus for the study was an age-old complaint of some Facebook users: That going on Facebook and seeing all the great and wonderful things other people are doing makes people feel bad about their own lives.

    So although conventional wisdom might say that seeing positive things makes you happier, here there have been accusations to the contrary -- positive things about other people make you feel lousy about yourself. This study ostensibly looked at that (and I think it found something along the lines of conventional wisdom: happy posts make you post happy stuff, a [dubious!] proxy for your own happiness...).

    If Facebook knew (and how would they?) that X makes you depressed, then yes...there might be some moral issues with that. But it seems that Facebook asked a legitimate question -- especially so given that it was published in PNAS.

    That said, it feels a little shady. But then, when I log onto Facebook, I am certainly not expecting any aspect of the website to be designed with my best interests in mind!

  • Re:One solution (Score:3, Interesting)

    by flyneye ( 84093 ) on Monday June 30, 2014 @07:29AM (#47348963) Homepage

    But, think of the implications for upcoming elections! How will the Repubmocrats keep 100% power against independents, tea party and other radical despots competing against the chosen ones? Control! The people obviously need to be controlled; they don't know what is good for them, and the Repubmocrats always will.
    Since there is a market, Facebook, who covers most demographics, can help by raising tensions toward sinister interlopers in our one party political system.
    Look for upcoming "hour of hate" shows profiling Tea Party advocate Emanuel Goldstein, NRA talking heads, Farmers, Anti-war hippies and anyone without the Hillary Clinton seal of approval on their forehead or hand.....

  • by CodyRazor ( 1108681 ) on Monday June 30, 2014 @08:01AM (#47349067) Homepage

    The problem is not that they attempted to create an emotional response or manipulate people's emotions. As people are constantly pointing out, advertisers do that all the time. What people don't seem to grasp is that there is a large difference between this and advertising.

    The problem is the way it was done. People use facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call. If the telephone somehow manipulated what you heard to make your friend sound more negative or positive without changing their core meaning that would be unethical without informed consent, just as this is.

    A more extreme version would be Facebook subtly modifying the content of what your friends post as it appears to you, without anyone knowing it was doing this. That would be even more unethical. The problem is misrepresentation: the method by which they attempt to manipulate emotions.

  • by bickerdyke ( 670000 ) on Monday June 30, 2014 @08:18AM (#47349121)

    Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line.

    But according to /., that's not what happened here.

    According to this article here [], no messages were changed:

    Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone

    (emphasis mine).

    I agree with you that changing the actual messages would not be acceptable by any standard.
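    The mechanism the quoted article describes, leaving posts untouched but changing which ones appear, can be sketched in a few lines. This is a hypothetical illustration only: the tone labels, omission probability, and function name are all invented, not Facebook's actual feed-ranking code.

    ```python
    # Minimal sketch of tone-based feed filtering: posts are never edited,
    # but some with a given tone are omitted from the rendered feed.
    # Tone labels and the omission rate are hypothetical.
    import random

    def build_feed(posts, reduce_tone="positive", omit_prob=0.3, seed=42):
        """Return a feed with a fraction of `reduce_tone` posts omitted.

        Each post is a dict with 'text' and 'tone'. Post contents are
        never modified; posts are only included or excluded.
        """
        rng = random.Random(seed)  # fixed seed so the sketch is reproducible
        feed = []
        for post in posts:
            if post["tone"] == reduce_tone and rng.random() < omit_prob:
                continue  # drop this post from the rendered feed
            feed.append(post)
        return feed

    posts = [
        {"text": "Got the job!", "tone": "positive"},
        {"text": "Stuck in traffic again.", "tone": "negative"},
        {"text": "Best vacation ever!", "tone": "positive"},
        {"text": "Lunch was okay.", "tone": "neutral"},
    ]
    feed = build_feed(posts)
    print([p["text"] for p in feed])
    ```

    The ethical dispute in this thread is precisely about the `continue` line: the message text is intact, yet the overall tone of what the reader sees has been deliberately shifted.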

    Just in case you haven't noticed: I'm surprised at the number of people who are surprised.

    Then you do not understand what is going on. Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt, but that doesn't make it acceptable. Screwing around with people's emotions in a controlled experiment should require, at minimum, review by a genuinely independent ethical review board, and probably genuine informed consent. Facebook couldn't be bothered with either one. They seem to regard their users as insects to be manipulated and dissected.

    And again I agree with you that you're stepping over a line when you're consciously manipulating people's feelings for economic reasons. But this line is crossed thousandfold already. The type of environment is secondary. A/B-tests take place in controlled environments, too.

  • by Anonymous Coward on Monday June 30, 2014 @08:31AM (#47349169)

    I talked to several (non-tech) friends about this, and they were more upset about Facebook "censoring" out posts than the emotional manipulation. In their minds, Facebook allows everything to be shown, but certain topics gain preference due to likes or dislikes. However, they will show you everything if you scroll far enough.

    Their outrage came from the thought that FB was removing "happy" content from their feed. (That it was no longer a "dumb" pipe for social data).

  • Re:A/B-Testing (Score:4, Interesting)

    by Trepidity ( 597 ) on Monday June 30, 2014 @08:50AM (#47349261)

    Apparently they did actually get IRB approval, oddly enough. The study was jointly done with two universities, and from what other researchers have told me, the two universities' IRBs approved the protocol. I'm surprised myself that they would. Would be curious to see what their reasoning was.
