In 2012, Facebook Altered Content To Tweak Readers' Emotions

The Atlantic reports that two years ago, Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize material of a particular emotional tone, negative or positive, and observing the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves. This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it. At least they showed their work.
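
A note on mechanics: the study classified posts as positive or negative by counting emotion words (it used the LIWC word lists), then probabilistically omitted posts of one tone from the News Feed. Below is a minimal sketch of that kind of word-count scoring and feed skewing; the word sets and the drop rate are made-up stand-ins, not the study's actual dictionaries or parameters.

    # Word-list sentiment scoring in the spirit of LIWC word counting.
    # The word sets below are tiny illustrative stand-ins.
    import random

    POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
    NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

    def tone(post):
        """Label a post 'positive', 'negative', or 'neutral' by word counts."""
        words = [w.strip(".,!?").lower() for w in post.split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return "positive" if pos > neg else "negative" if neg > pos else "neutral"

    def skew_feed(posts, suppressed_tone, drop_rate=0.5):
        """Probabilistically omit posts of one emotional tone from a feed."""
        return [p for p in posts
                if tone(p) != suppressed_tone or random.random() > drop_rate]
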
  • consent (Score:5, Interesting)

    by sribe ( 304414 ) on Sunday June 29, 2014 @08:46AM (#47344095)

    There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Sunday June 29, 2014 @08:58AM (#47344129)

    I see it in myself, on the rare occasions that I actually post (roughly 5-10 times a year), and I see it with others whenever I go online to browse the posts of the people I'm connected with ... called "Friends" (finger quotes!) on FB:

    Facebook and other "social networks" encourage posing. No two ways about it.

    If you get all worked up and batter your self-esteem just because somebody posted himself in cool poses or at some event that you "missed out" on, that's your own doing. I see this a lot, since I'm only on FB for my tango dancing connections, a pastime where posing sometimes actually is part of the game. Actually knowing the person behind a neat facade on FB does put things into perspective.

    Bottom line:
    People shouldn't get more attached to these things than is good for them. If this neat little stunt by FB shows them that, then all the better.

    My 2 cents.

  • Filter bubble (Score:5, Interesting)

    by drolli ( 522659 ) on Sunday June 29, 2014 @09:11AM (#47344179) Journal

    What actually disturbs me more is: why would they do this? The answer is simple: they want to determine the most effective non-obvious way of creating filter bubbles, so that users feel good and stay longer.

    It is, so to speak, a "second-order filter bubble", i.e. the use of a positive feedback mechanism.
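
    As an illustration of that positive-feedback mechanism, here is a minimal simulation sketch (the topic names, numbers, and engagement model are all hypothetical, not anything Facebook has disclosed): engagement boosts a topic's ranking weight, which makes further engagement with that topic more likely, so the feed narrows over time.

        # Hypothetical second-order filter bubble: engagement feeds back
        # into ranking weights, so one topic tends to crowd out the rest.
        import random

        weights = {"tango": 1.0, "politics": 1.0, "cats": 1.0}

        def pick_topic():
            """Sample a topic in proportion to its current ranking weight."""
            topics, w = zip(*weights.items())
            return random.choices(topics, weights=w)[0]

        def simulate(steps=1000, boost=0.1):
            for _ in range(steps):
                topic = pick_topic()
                # Engagement is more likely for already-favored topics, and
                # each engagement raises the weight: positive feedback.
                if random.random() < weights[topic] / sum(weights.values()):
                    weights[topic] += boost
            return weights

        print(simulate())  # typically one topic ends up dominating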

  • by langelgjm ( 860756 ) on Sunday June 29, 2014 @09:21AM (#47344225) Journal

    It's called the Common Rule [hhs.gov], although it generally only applies to federally funded research. There is some evidence [cornell.edu] that this study was in part federally funded. I think there are serious questions about whether a click-through agreement meets the standards of informed consent.

    Although the study was approved by an institutional review board, I'm surprised, and the comment from the Princeton editor makes me wonder how well they understood the research design (or how clearly it was explained to them). This would never have gotten past my IRB.

  • Re:consent (Score:2, Interesting)

    by Smallpond ( 221300 ) on Sunday June 29, 2014 @10:06AM (#47344351) Homepage Journal

    There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...

    There is no such law. In any case, this is the basis of the entire news business. Why do they report murders but not acts of charity (unless a celebrity is involved)? It is all about getting you to watch and see ads. Facebook is doing nothing that TV news hasn't been doing for years.

  • Re:consent (Score:3, Interesting)

    by by (1706743) ( 1706744 ) on Sunday June 29, 2014 @10:07AM (#47344357)
    But surely users are allowed to be put into A/B tests run for *commercial/advertising* purposes, right? Is doing something for academic purposes somehow worse than doing it for business purposes? Personally, I would rather my online behavior be used for a purpose that nominally increases our knowledge than for one that increases someone's bottom line.

    That said, I do find this whole thing to be a little shady...but I'm not sure it's a particularly rational reaction, given that I rarely care about A/B testing when it's being used to shamelessly make money off of me...
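
    For what it's worth, commercial A/B tests of the sort described above usually assign users to buckets deterministically, e.g. by hashing a user id together with an experiment name. A minimal sketch (the experiment name and split below are hypothetical, not Facebook's actual system):

        # Deterministic A/B bucketing: the same user always lands in the
        # same bucket for a given experiment, with no per-user state stored.
        import hashlib

        def bucket(user_id, experiment="feed_tone", treatment_fraction=0.5):
            """Return 'treatment' or 'control', stable per user/experiment."""
            h = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
            frac = int(h[:8], 16) / 0xFFFFFFFF  # map hash onto [0, 1]
            return "treatment" if frac < treatment_fraction else "control"

        print(bucket(12345))  # identical output on every call
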
  • Re:consent (Score:2, Interesting)

    by Anonymous Coward on Sunday June 29, 2014 @10:46AM (#47344497)
    The researchers were trying to incite negative emotions in the subjects. That's unethical if the people don't consent. You're playing with people's lives beyond Facebook. Follow ET's rule: Beeee Gooood.
  • by nickmalthus ( 972450 ) on Sunday June 29, 2014 @11:17AM (#47344655)
    Secret psychological tests on the population en masse? Edward Bernays [wikipedia.org] would have been elated to have this capability in his time.

    In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons...who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind.

  • Re:consent (Score:5, Interesting)

    by Ceriel Nosforit ( 682174 ) on Sunday June 29, 2014 @11:26AM (#47344697)

    There are laws against assault, bullying, and so on. The positive spin is innocuous, but the negative spin is not.

    With 700,000 potential victims, the numbers are against them: when your sample size is that large, outliers are the rule, not the exception (see the quick calculation below).

    The risk of copycat suicide, for example, should have been obvious to those conducting this study.
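
    The back-of-the-envelope arithmetic: even a very rare adverse event has a non-trivial expected count at this scale. The one-in-100,000 per-user risk below is purely illustrative, not a measured figure.

        # At n = 700,000 users, a one-in-100,000 event is expected ~7 times,
        # and at least one occurrence is all but certain.
        n, p = 700_000, 1e-5
        print(f"expected events: {n * p:.1f}")                 # 7.0
        print(f"P(at least one): {1 - (1 - p) ** n:.4f}")      # ~0.9991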

  • by QilessQi ( 2044624 ) on Sunday June 29, 2014 @11:45AM (#47344809)

    Hey, Facebook! Can you help me with some experiments of my own? I'd like to see the outcome if...

    1. Right before an election, all posts favoring candidate X or the political views of party X were promoted to the top of everyone's feed and given 20 extra fake "Likes", while posts favoring the opposition were demoted and de-liked.

    2. Phrases in posts favoring candidate X or the political views of party X were subtly "edited" when they appeared in everyone else's news feeds to be more positive (e.g., "like" to "love", "good" to "great"), while phrases in posts favoring the opposition were given the reverse treatment and sprinkled with misspellings.

    3. FB users with a tendency to oppose party X had random fake posts/comments attributed to them, appearing only in others' feeds, in which they insult their friends' baby pictures, make tasteless jokes, and vaguely threaten supporters of party X to "unfriend me if ur so lame u can't take the TRUTH, lol".
