In 2012, Facebook Altered Content To Tweak Readers' Emotions

The Atlantic reports that two years ago, Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize material with either a negative or positive tone, and observing the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves. This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it. At least they showed their work.
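For readers curious about the mechanics: sorting content "by tone" generally comes down to counting emotion words. Below is a minimal Python sketch of that kind of pipeline, assuming a LIWC-style word-count classifier; the word lists, threshold, and suppression rate are illustrative placeholders, not the study's actual lexicon or parameters.

```python
import random

# Hypothetical emotion lexicons -- stand-ins for a LIWC-style word list,
# not the one the researchers actually used.
POSITIVE = {"happy", "great", "love", "wonderful", "awesome"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def classify(post: str) -> str:
    """Label a post by comparing counts of positive vs. negative words."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def skew_feed(posts: list[str], suppress: str, rate: float = 0.5) -> list[str]:
    """Randomly drop a fraction of posts with the targeted tone,
    leaving everything else in the feed untouched."""
    return [p for p in posts
            if classify(p) != suppress or random.random() >= rate]

# One experimental arm might see a feed with fewer negative posts:
feed = ["I love this wonderful day", "feeling sad and lonely", "meeting at 3pm"]
print(skew_feed(feed, suppress="negative"))
```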
  • consent (Score:5, Interesting)

    by sribe ( 304414 ) on Sunday June 29, 2014 @08:46AM (#47344095)

    There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...

    • Re:consent (Score:5, Insightful)

      by Sqr(twg) ( 2126054 ) on Sunday June 29, 2014 @09:00AM (#47344143)

      [citation needed]. Almost every major website does A/B testing [wikipedia.org]. Is there a law against this? (That's not a rhetorical question. I actually would like to know.)
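      (For context, garden-variety A/B testing is usually just deterministic bucketing by user ID. A minimal sketch — the function name, experiment name, and variants here are made up for illustration:)

      ```python
      import hashlib

      def assign_variant(user_id: str, experiment: str,
                         variants: tuple = ("A", "B")) -> str:
          """Hash user id + experiment name so a user always lands
          in the same bucket for a given experiment."""
          digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
          return variants[int(digest, 16) % len(variants)]

      print(assign_variant("user-12345", "newsfeed_tone"))  # stable "A" or "B"
      ```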

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        I don't think this is about whether Facebook had a legal right to do this, but more on that in a minute. It's more about whether it was ethical on their part. Regardless, I think it clearly was not ethical for the researchers to do this study without getting the approval of the users who took part in the study.

        Getting back to the legal issue, every site or app has a Terms of Service agreement. Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes? If they don't, it seems to me that could be a legal issue.

        • Re: (Score:3, Interesting)

          But surely users are allowed to be put in an A/B test used for *commercial/advertisement* purposes, right? Is doing something for academic purposes somehow worse than for business purposes? Personally, I would rather my online behavior be used for a purpose which nominally increases our knowledge than for a purpose which increases someone's bottom line.

          That said, I do find this whole thing to be a little shady... but I'm not sure it's a particularly rational reaction, given that I rarely care about A/B testing...
        • If they don't, it seems to me that could be a legal issue.

          I doubt it. First, there is no "law" requiring informed consent for such an innocuous study, just ethical guidelines. Second, there is no law saying that entities can only do things positively asserted in the TOS. Companies do behavior research all the time. Grocery stores experiment with different product placement, different background music, different lighting. They are not expected to get consent for that. This is no different. I am feeling a distinct lack of outrage about this.

          • Re:consent (Score:5, Interesting)

            by Ceriel Nosforit ( 682174 ) on Sunday June 29, 2014 @11:26AM (#47344697)

            There are laws against assault, bullying, and so on. The positive spin is innocuous, but the negative spin is not.

            With 700,000 potential victims, the numbers are against them: when your sample size is that large, outliers are the rule, not the exception.

            The risk of copycat suicide for example should have been obvious to those conducting this study.
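            Back-of-the-envelope, using an assumed base rate (not a figure from the study): even a 1-in-10,000 event is all but guaranteed to occur somewhere in a sample of 700,000.

            ```python
            # Assumed illustrative base rate of some rare adverse outcome.
            n, p = 700_000, 1e-4
            print(n * p)          # 70.0 expected cases in the sample
            print((1 - p) ** n)   # ~4e-31 chance of zero cases
            ```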

            • by ClioCJS ( 264898 )
              And anyone who wants to blame someone's suicide on the content of that person's facebook feed... is the type of person who blames a death on a gun, instead of the real cause.
            • Re: (Score:3, Insightful)

              by wisnoskij ( 1206448 )
              A psychological experiment cannot be called innocuous before the results are in. Who knows, maybe an extremely depressed person is 20 times more likely to commit suicide if they see that the world is 100% perfectly happy and positive.
              • I was addressing whether or not there could be legal issues, so the subject of intent becomes very relevant.
                The intent, for half the people affected, was to cause harm.

                  • No, it was not. It was an experiment; they might have had some idea of what they thought was going to happen, but the entire point was to try and see. They were hoping that something would happen, but the more unexpected and interesting the result, the better, really.
                    • In conducting the experiment, they, among other things, caused harm by inducing depression that persisted even after the experiment. Causing harm was part of the intent, even if not the ultimate intent.

                    Your argument was: the means justify the end.

          • by rtb61 ( 674572 )

            Apparently the psychologist involved thought it was OK, because it is part of normal default policy: "Facebook apparently manipulates people's News Feeds all of the time". https://theconversation.com/sh... [theconversation.com].

            SHIT THAT'S A REAL SHOCKER deserving of capitals.

        • by EvilSS ( 557649 )

          Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes?

          Actually, it does.

      • Re: (Score:2, Interesting)

        by Anonymous Coward
        The researchers were trying to incite negative emotions in the subjects. That's unethical if the people don't consent. You're playing with people's lives beyond Facebook. Follow ET's rule: Beeee Gooood.
      • Re:consent (Score:4, Informative)

        by Jmstuckman ( 561420 ) on Sunday June 29, 2014 @11:16AM (#47344649) Journal

        From a legal standpoint, for an activity to be considered "research", it must be "designed to develop or contribute to generalizable knowledge". http://www.virginia.edu/vpr/ir... [virginia.edu]

        When a website uses A/B testing to improve its own internal operations, it's seeking to privately develop limited knowledge on its own operations, rather than general knowledge. This puts it outside the scope of US federal regulations on research, which have been narrowly crafted to avoid regulating commercial activities like these.

        Given these criteria, Facebook was surely engaged in research.

        • I really doubt Facebook conducted this experiment in order to "develop or contribute to generalizable knowledge" out of the kindness of their hearts. Far more likely, they seek to profit from it by generating more posts, more traffic, etc.

          This is no different from retail stores conducting studies linking customer spending habits to item locations; they do such studies all the time.

      • Re:consent (Score:5, Informative)

        by Mashiki ( 184564 ) <mashiki&gmail,com> on Sunday June 29, 2014 @05:22PM (#47346323) Homepage

        Yes, there are laws against this. Anyone who lives in Canada and was part of the experiment but did not receive informed consent may contact Health Canada/the federal Crown about it. It's illegal here. [hc-sc.gc.ca]

      • What if they modified the mix of content relative to political opinions as an "experiment" just around election time? It's still testing. Facebook? Politics? ... oh wait, forget about that: it's an oxymoron.
    • Just think of it as the new Facebook beta.

    • by Anonymous Coward

      What exactly is considered a "psychological experiment"? Your definition seems very vague, and implies that any sort of software usability testing or change that involves offering different experiences to different users should be outlawed or very strictly controlled.

      Take Mozilla Firefox as a recent example. Firefox 28 had a shitty, but at least partially usable user interface. Then Mozilla released Firefox 29, which brought in the Australis user interface, which is indisputably a pretty much unusable pile

      • by Anonymous Coward

        Well, I dunno, but I think being published in PNAS probably points towards it being considered a psychological experiment and not software usability testing.

      • by rossdee ( 243626 )

        Most of this 'free' software has a "do you accept the terms and conditions" clause where you have to click "I agree" in order to install or run the software.
        Now the enforceability of such agreements may be open to dispute, especially if it entailed some sacrifice on the user's part (you agree to give up your firstborn child to us...), but it usually would cover "we can change the user interface at any time, and don't complain if memory leaks cause your system to crash eventually".

    • Filter bubble (Score:5, Interesting)

      by drolli ( 522659 ) on Sunday June 29, 2014 @09:11AM (#47344179) Journal

      What actually disturbs me more is: why would they do this? The answer is simple: they want to determine the most effective non-obvious way of creating filter bubbles to make the user feel well and stay longer.

      It is, so to speak, a "second-order filter bubble", i.e. the use of a positive feedback mechanism.

      • They want to determine the most effective non-obvious way of creating filter bubbles to make the user feel well and stay longer.

        Well, they want the user to stay longer. But making them feel well wouldn't actually do that. Depressed people use the internet more. If it makes you depressed, then bingo! Ad revenue!

      • Rather than whine about being a guinea pig you could just not play their game. Every hour spent on Facebook is an hour of your life lost to more productive or enriching pursuits.

    • by langelgjm ( 860756 ) on Sunday June 29, 2014 @09:21AM (#47344225) Journal

      It's called the Common Rule [hhs.gov], although it generally only applies to federally funded research. There is some evidence [cornell.edu] that this study was in part federally funded. I think there are serious questions about whether a click-through agreement meets the standards of informed consent.

      Although the study was approved by an institutional review board, I'm surprised, and the comment from the Princeton editor makes me wonder how well they understood the research design (or how clearly it was explained to them). This would never have gotten past my IRB.

    • Re: (Score:2, Interesting)

      by Smallpond ( 221300 )

      There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...

      There is no such law. In any case, this is the basis for the entire news business. Why do they report murders but not acts of charity (unless a celebrity is involved)? It is all about getting you to watch and see ads. Facebook is doing nothing that TV news hasn't been doing for years.

      • There is no such law. In any case, this is the basis for the entire news business.

        While it may not be an actual law, there are strict rules [hhs.gov] about this for any study, like this one, that receives US federal funding.

    • There are laws governing obtaining informed consent from humans before performing psychological experiments on them.

      That only applies to federally funded research (which means almost all colleges and universities). Attempting to apply this to the private sector would raise serious First Amendment questions. What one person calls "psychological experiments", another might call "protected free speech".

      • by sribe ( 304414 )

        That only applies to federally funded research (which means almost all colleges and universities). Attempting to apply this to the private sector would raise serious First Amendment questions. What one person calls "psychological experiments", another might call "protected free speech".

        This study appears to have taken federal funds...

    • Ethics board must be a bunch of chimps. It's not Little Albert, but then again, it's on such a massive scale, with 700k users. What if someone committed suicide because of this manipulation?

    • Re: (Score:2, Insightful)

      by nospam007 ( 722110 ) *

      "There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting..."

      They can do whatever they want, it's their site. They decide what to show you, whether to show you something, and when.
      And they have the right to sell that information about you at any price they choose.
      It's their business.
      You are the product, the psychologists are the customer.

      • Re:consent (Score:4, Insightful)

        by sribe ( 304414 ) on Sunday June 29, 2014 @11:05AM (#47344597)

        They can do whatever they want, it's their site.

        Did you think about that before you wrote it? If not, take a second and think about it.

        There are many, many, many things they cannot do with their site.

        • by Nkwe ( 604125 )

          They can do whatever they want, it's their site.

          Did you think about that before you wrote it? If not, take a second and think about it.

          There are many, many, many things they cannot do with their site.

          Within technical limitations, they can do anything they want with their site. However, some things they could do may have legal or financial consequences.

    • by dbc ( 135354 )

      That's the first thing that popped into my mind. After having spent many hours over the past week helping my daughter do paperwork so that she could submit her extremely benign science fair project to the county science fair's institutional review board, I'm wondering how FB can get away with this. I guess that they can get away with it because no one will call them out on it, unless some victims file a lawsuit.

      That's the modern world -- a 15-year-old kid doing something demonstrably harmless has to do hours of paperwork...

    • It's also not just a legal matter. Performing experiments on humans without their consent is immoral.

      • It's also not just a legal matter. Performing experiments on humans without their consent is immoral.

        So is it immoral when Walmart puts Pop-Tarts on the endcap, or different colored bubblegum in the impulse aisle? They are trying to see if they can manipulate you into buying more product. Stores even experiment with different colors. Some stores want you to stay longer, so they use happy colors. Other stores want you to leave quickly to make room for the next customer, so they use sad colors. They are altering the moods of their customers too and, again, without consent. I don't know what the threshold should be...

    • by Meski ( 774546 )
      If you know you're being tested, that could invalidate the results. Sure that's cold, but it's also reality.
  • by sjbe ( 173966 ) on Sunday June 29, 2014 @08:47AM (#47344097)

    This sort of thing is exactly why I have never signed up for an account. The lack of a moral compass at this company is profound.

    • by JustNiz ( 692889 ) on Sunday June 29, 2014 @09:59AM (#47344325)

      It seems it's pretty much the same for every other large US company, too.

    • by SeaFox ( 739806 )

      It's been easy to think of the people who got involved in Facebook as lemmings; apparently guinea pig was the more apt mammal to choose.

  • by forand ( 530402 ) on Sunday June 29, 2014 @08:48AM (#47344103) Homepage
    This is quite interesting research that should never have been done. I am rather surprised that the National Academy published the results of a study which violated multiple ethical guidelines put in place to protect human subjects. Did Facebook track the number of suicides in the 700,000 sample? Did those given a sadder-than-average stream have a higher or lower rate? Do the Facebook researchers address the ethical questions posed by performing such an experiment at all?
    • Regardless of whether you're right or wrong, you're already too late [wikipedia.org].
    • to protect human subjects

      Oh do stop being so precious. It's no different from an individual posting a sad or depressed piece, themselves. Should they then be sued, arrested or punished for the "emotional damage" they cause to anyone who reads it?

    • I am rather surprised that the National Academy published the results of a study which violated multiple ethical guidelines put in place to protect human subjects.

      The only real point of being accepted to the National Academy is access to Proceedings of the National Academy of Sciences. They don't turn down anything from members.

      For NA members, it's where you publish when nobody else will accept your paper.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Sunday June 29, 2014 @08:58AM (#47344129)

    I see it in myself, on the rare occasions that I actually post, which is roughly 5-10 times a year, and I see it with others whenever I go online to browse a little in the posts of the people I'm connected with ... called "Friends" (Fingerquotes!) on FB:

    Facebook and other "social networks" encourage posing. No two ways about it.

    If you get all worked up and batter your self-esteem just because somebody posted himself in cool poses or at some event that you "missed out" on ... I get this a lot, since I'm only on FB for my tango dancing connections, a pastime where posing sometimes actually is part of the game. Actually knowing the person behind a neat facade on FB does put things into perspective.

    Bottom line:
    People shouldn't get more attached to these things than is good for them. If this neat little stunt by FB shows them that, then all the better.

    My 2 cents.

    • this is junk science... and there was no informed consent (buried in a EULA is not informed consent)

      first and foremost, TFA's description is wrong... TFA and the linked research did NOT list what words make a post "positive" or "negative"

      we cannot check their work by examining what factors they chose to represent the experimental variables

      2nd, you're absolutely right that what people post to facebook.com is often not an accurate reflection of their current mood or actions

      my only caveat is that some facebook users really don't...

    • by HnT ( 306652 )

      Not only do they facilitate and encourage posing, the whole reason for "web 2.0" and "social networks" is absolutely nothing but posing on a global scale.

  • by angularbanjo ( 1521611 ) on Sunday June 29, 2014 @09:07AM (#47344171)
    ... that's pretty shocking.
  • In ___a year___, ___a company___ Altered Content To Tweak Readers' Emotions. Welcome to the dark, twisted, conspiracy-laden world of marketing.
  • So basically all they've done is tell us that people respond to their surroundings. Okay, nothing new there. What would be interesting is if FB could somehow start quantifying the level of the reaction. Then, after a few hundred years of study we might start to get the glimmerings of a science.
  • ... our stock is very happy ... buy our stock ... you like using Facebook ... you are happy when you use FaceBook ... buy our stock ...

  • I have had my own posts removed from my newsfeed. That should be the bigger issue - I really doubt I've been singled out. (looks out window for black heli...)

    • Back when I used fb, what I liked most was how you'd have a page with something actually important that loaded in a snap, and fb would "fail" to thumbnail it with a nondescriptive error; but if you loaded some page full of bloatshit about something stupid, it would thumbnail quickly and show right up so that you could share the stupidity to your heart's content, even if it was old, now-non-news.

      FB has always shown people only a subset of your posts, even if they explicitly ask to see all of them. That's a big problem...

      • They eventually revealed the reason they only show your content to a subset of your followers:

        So they could charge you to reach more of them. Seriously. You can pay to "promote" your posts, and all that does is increase the reach within the people that have explicitly indicated interest in your content.

        • Someone I know tried that once on a lark, to see if it works. It doesn't.

        • You misread my post. I have had my posts removed from my newsfeed. Not talking about friends, etc. If I view my newsfeed sorted by "most recent", fb has removed content. If I post about puppies, it shows up fine. So again, not only is fb denying my content to others, its mechanisms are denying it to me as well unless I view it directly on my home page.

  • by nickmalthus ( 972450 ) on Sunday June 29, 2014 @11:17AM (#47344655)
    Secret psychological tests on the population en masse? Edward Bernays [wikipedia.org] would have been elated to have this capability in his time.

    In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons...who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind.

  • by QilessQi ( 2044624 ) on Sunday June 29, 2014 @11:45AM (#47344809)

    Hey, Facebook! Can you help me with some experiments of my own? I'd like to see the outcome if...

    1. Right before an election, all posts favoring candidate X or the political views of party X were promoted to the top of everyone's feed and given 20 extra fake "Likes", while posts favoring the opposition were demoted and de-liked.

    2. Phrases in posts favoring candidate X or the political views of party X are subtly "edited" when the appear in everyone else's news feed to be more positive (e.g., "like" to "love", "good" to "great"), while phrases in posts favoring the opposition are given the reverse treatment and sprinkled with misspellings.

    3. FB users with a tendency of opposition to party X have random fake posts/comments from them appear in others' feeds only, in which they insult their friends' baby pictures, make tasteless jokes, and vaguely threaten supporters of party X to "unfriend me if ur so lame u can't take the TRUTH, lol".

    • Ok, guys, I swear that there were no misspellings when I typed "when they appear" in #2 above. IT HAS BEGUN.

  • Is /. doing the same thing?
  • Facebook users gave up their privacy and allowed their personal data to be mined. Posts have been used against them by employers, criminals, government agencies, various companies, and Facebook itself. Facebook sells your data to advertisers and other organisations. Does this really come as a surprise to anyone?

    What Facebook has shown is that they can easily manipulate their users in a predictable manner. In this case it was for a study, but is there anything stopping them doing something like this as a service to advertisers...

  • The research is (partly at least) army funded. That would explain why every academic ethics rule was ignored. Cornell co-authored this research, so they would know. Check the last couple of lines to see for yourself. That part makes this even more disturbing. The media should include this 'small' detail. http://www.news.cornell.edu/st... [cornell.edu]
    • Re: (Score:3, Informative)

      by geggo98 ( 835161 )
      Hm, the last paragraph says:

      Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.

      • by santax ( 1541065 )
        Yeah, they changed that after the outrage... Here is a screenie from before, because I had a feeling the word army would be gone soon once I noticed it. http://tinypic.com/view.php?pi... [tinypic.com] Needless to say, the first version was correct and this is just really bad damage control.
  • There is way too much discussion of Facebook's legal standing here; if you have ever seen a moot court, you know that legal reasoning and even the body of the law can be used to argue either side. The law is based on competing priorities, as politics and economics are.

    A more telling result is the impression the disclosure leaves, which is why the story has legs. It is a bit like what happened to Donald Sterling; something that seemed OK in one context got leaked into a different context where it appears total
