
Facebook's Emotion Experiment: Too Far, Or Social Network Norm? 219

Posted by timothy
from the applied-semantics dept.
Facebook's recently disclosed 2012 experiment in altering the tone of what its users saw in their newsfeeds has brought it plenty of negative opinions to chew on. Here's one, pointed out by an anonymous reader: Facebook's methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent." For a very different take on the Facebook experiment, consider this defense of it from Tal Yarkoni, who thinks the criticism it's drawn is "misplaced": Given that Facebook has over half a billion users, it’s a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people’s emotions. Yet nobody seems to complain about this much–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more. ... [H]aranguing Facebook and other companies like it for publicly disclosing scientifically interesting results of experiments that it is already constantly conducting anyway–and that are directly responsible for many of the positive aspects of the user experience–is not likely to accomplish anything useful. If anything, it’ll only ensure that, going forward, all of Facebook’s societally relevant experimental research is done in the dark, where nobody outside the company can ever find out–or complain–about it."
This discussion has been archived. No new comments can be posted.


  • One solution (Score:5, Insightful)

    by Anonymous Coward on Monday June 30, 2014 @06:37AM (#47348827)

    Just don't use social networking.

    • Re: (Score:3, Interesting)

      by flyneye (84093)

But, think of the implications for upcoming elections! How will the Repubmocrats keep 100% power against independents, the tea party, and other radical despots competing against the chosen ones? Control! The people obviously need to be controlled; they don't know what is good for them, and the Repubmocrats always will.
      Since there is a market, Facebook, who covers most demographics, can help by raising tensions toward sinister interlopers in our one party political system.
      Look for upcoming "hour of hate" shows profiling

      • by BobMcD (601576)

        How will the Repubmocrats keep 100% power against independents, tea party and other radical despots competing against the chosen ones?

        https://www.youtube.com/watch?... [youtube.com]

        It's called 'First Past the Post', and Facebook has nothing to do with it.

        • How will the Repubmocrats keep 100% power against independents, tea party and other radical despots competing against the chosen ones?

          It's called 'First Past the Post', and Facebook has nothing to do with it.

          You're right that "first past the post" is a big part of the problem. But that's far from the whole story.

          For one thing, "Repubmocrats" do NOT have 100% of the power. It may be near 100%, but it isn't 100% -- for example currently [wikipedia.org] two U.S. senators, four major city mayors, and hundreds of state and local officials across the U.S. are elected independents or members of 3rd parties.

          That means that quite a few voters across the U.S. have actual experience in ELECTING someone who is not a member of the tw

      • Re:One solution (Score:4, Insightful)

        by s.petry (762400) on Monday June 30, 2014 @11:45AM (#47350641)

        Mostly this, but I believe it could have been worded differently for clarity.

1. Like all institutions that impact the public, Facebook and Twitter are being, and have been, used for propaganda. Guiding opinion happens from the time a kid enters public school through to graduating from a university. We don't like to admit it, because it's frightening when you investigate the scope at which people are being indoctrinated. Reality can be a bummer, but it is reality.

        2. Just like with Television, there is a massive amount of social engineering with online media and content. How many "News" programs want you to "follow us on Twitter" and "Like us on Facebook"? All of them, and you will hear and see that phrase repeated over and over.

3. Use of institutions like education and social media for purposes of "Social Engineering" is propaganda in its purest form, and insidious. This is much worse than bread and circuses alone, because it constantly provides an opinion that someone wants you to have without any dialogue or discussion on the impact or morality of the opinion. I.e., the Iraq war, Syria, candidates for public office, etc...

4. The fact that propaganda wars are waged against citizens is not new. It's been documented as far back as I can recall. Certain people owned newspapers and provided an opinion. If a counter opinion is provided, it's generally squashed by ad hominem, and when that doesn't work it simply won't be discussed on any mass media. This has been transferred to radio, TV, and now social media. The scale at which it's currently done is massive compared to when I was a kid.

        Investigating and learning about the psychology being used is like taking the red pill. A whole new world opens up to you, and you can see how much propaganda is being generated.

    • by jythie (914043)
      And yet here you are, here we all are, interacting on slashdot.
It's all perfectly harmless. ctOS is here for you! [wikia.com] For those that haven't played the game, stop reading. One of the plot points in the game was "targeted reassignment" of vote predictions to get a mayor re-elected.

more interesting... (Score:5, Interesting)

    by Selur (2745445) on Monday June 30, 2014 @06:38AM (#47348833)

    it doesn't sound like this is the first experiment done by the facebook crowd -> What other experiments happened? Were the participants informed about it later? Who takes the blame if such an experiment results in someone getting hurt?

    • by N1AK (864906) on Monday June 30, 2014 @07:07AM (#47348915) Homepage
      Where do you draw the line? If Facebook realised that showing more negative stories (by monitoring what people already see) makes people more likely to click adverts is that really any better/worse than them artificially increasing/decreasing the amount of positive stories a user sees?

If Google was having a hard time deciding if a page was junk or not, would it be unethical to put it in the results for some users and see how they react? Clearly that's an experiment without user knowledge, but it certainly doesn't sound like it's unethical to me, and stopping that kind of experimentation or flooding sites with notices about it wouldn't make things better for users.

      Obviously there are experiments they could run that would be unethical if users weren't informed and monitored; discussing where the lines are and agreeing some best practices would therefore make sense.
    • by jythie (914043)
      Likely quite a few. It is unusual to publish (probably because the business value is low) but these types of tests are really common in marketing. That is why the IRB was fine with the research, it was just one of many experiments they run when trying to figure out what works 'best'.
  • I think it's fine (Score:3, Insightful)

    by Threni (635302) on Monday June 30, 2014 @06:39AM (#47348835)

I love how overblown the coverage of this has been... as if it's driven people to suicide. It's their site, they can do what they want; people are free to leave if they want. Nothing to see here.

    • by buck-yar (164658)

      I think they didn't go far enough, more experiments like this should be done. Never before has such a database been compiled (other than NSA). Much can be learned.

    • by DarkOx (621550) on Monday June 30, 2014 @07:52AM (#47349035) Journal

      That is kinda my reaction as well. It seems this issue people have here is that facebook sought to manipulate peoples emotional state. The thing is that is exactly what just about every advertiser does all the time.

      Home Security System ads: clearly designed to make you feel vulnerable and threatened.

      Cosmetic surgery ads: clearly designed to make you feel inadequate.

Beer ads: very often designed to make you feel less accepted; you need their product to be perceived as cool. Ditto for clothing and personal care products.

      Political ads: feelings of security and family (at least if you pick their candidate)

      This list goes on...

It might not have the same rigor as the academic world, but they absolutely do focus-group this stuff and find out how people 'feel'. The marketers have researched what words, phrases, and imagery can best evoke these feelings. If what facebook did is illegal or even just unethical, then so is pretty much everything the modern advertising industry has been up to for the past 70 years.

I am sure many people would actually agree with that, but I don't see why it's suddenly so shocking and deserving of attention just because facespace does it.

      • by sjbe (173966) on Monday June 30, 2014 @08:28AM (#47349159)

        The thing is that is exactly what just about every advertiser does all the time.

No it is NOT the same thing. The beer company does not have any control over what *I* say, and they do not get to (legally) change what I say or how it is delivered to others. There is a HUGE difference between putting a message out there and seeing how people react to it versus actually changing what you or I say and how it is delivered to someone else without my consent. The former is advertising, which is fine as long as it isn't too intrusive. The latter is a violation of personal sovereignty unless you obtain informed consent beforehand.

        Furthermore even if every advertiser actually did this (which they do not) and you have an ethical blind spot so large that you can't actually see what Facebook did wrong, two wrongs don't make a right. "Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors that they shouldn't be engaging in.

        • by nine-times (778537) <nine.times@gmail.com> on Monday June 30, 2014 @09:30AM (#47349539) Homepage

          As far as I could tell from reading about this, they didn't change what people said.

          Here's the thing, Facebook already filters what you see with the default setup. Your 500 friends each post 10 posts today, and when you load up your page on a social networking site, the page only displays 15. So how are those 15 chosen? (I'm making up numbers here, obviously)

          The obvious choice would be to show the 15 most recent posts, but that means there's a good chance you'll miss posts that are important and that you'd like to see, since you're only getting a brief snapshot of what's going on in that social networking site. Facebook instead has an algorithm that tries to determine which of those 5,000 posts you'll care most about. I don't know the specifics, but it includes things like favoring the people who you interact with most on Facebook.

          So what Facebook did in this study is they tweaked that algorithm to also favor posts that included negative words. The posts were still from that 5,000 post pool and the contents of the posts were unedited, but they subjected you to a different selection in order to conduct the research.

It's still an open question as to whether this sort of thing is appropriate, but it's important to note that this is something Facebook does all the time anyway. I think where it gets creepy is that Facebook is also an ad-driven company, so you have to wonder what the eventual goal of this research is. I can imagine Facebook favoring posts that include pictures of food to go along with an ad campaign for Seamless. Maybe they'll make a deal with pharmaceutical companies to adjust your feed to make you depressed, while at the same time plastering your feed with ads for antidepressants.
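The selection tweak described in this comment can be sketched in a few lines. This is purely a hypothetical illustration: the field names, weights, and word list are invented, and Facebook's real ranking algorithm is not public. The point is only that the posts themselves are untouched; only which ones get picked changes.

```python
# Hypothetical sketch of sentiment-weighted feed selection.
# All weights and field names are made up for illustration.

NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible", "lonely"}

def score(post, negativity_weight=0.0):
    """Base score from recency and interaction history; the
    experimental tweak adds a bonus when a post contains
    negative words."""
    base = post["recency"] + 2.0 * post["interaction"]
    words = set(post["text"].lower().split())
    if words & NEGATIVE_WORDS:
        base += negativity_weight
    return base

def build_feed(posts, k=15, negativity_weight=0.0):
    """Pick the top-k posts from the candidate pool. With
    negativity_weight > 0 you get the experimental condition:
    the same unedited posts, just a different selection."""
    return sorted(posts, key=lambda p: score(p, negativity_weight),
                  reverse=True)[:k]
```

With weight 0 this behaves like an ordinary relevance ranking; raising the weight changes which posts surface, without editing a single word of anyone's post, which is exactly the distinction being argued over in this thread.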

          • As far as I could tell from reading about this, they didn't change what people said.

            Yes they did. There is more to communication than the specific words used. Tone, timing, delivery, emphasis, etc all are part of the message. If Facebook altered any of these to be different from the expectations of the user without informing them beforehand then they changed what people said. There is MUCH more to human communication than the syntax used.

            It's still an open question as to whether this sort of thing is appropriate

            I disagree. I don't think it is an open question at all. How Facebook did what they did is unacceptable. Doing experiments like this is fine in pri

            • Yes they did. There is more to communication than the specific words used. Tone, timing, delivery, emphasis, etc all are part of the message. If Facebook altered any of these to be different from the expectations of the user without informing them beforehand then they changed what people said. There is MUCH more to human communication than the syntax used.

              I'm not sure how you think they changed the tone, timing, delivery, or emphasis of the messages. Apparently they used real posts and posted the entire content of each post without alteration. From what I understand, though I'm interpreting from a few different stories that I read, all they did was to alter the algorithm that Facebook already uses to choose which posts to show in your feed. They didn't insert or remove words from the posts. They didn't do anything to really re-contextualize them.

              Whether

        • by martas (1439879)

          "Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors that they shouldn't be engaging in.

          Ugh, didn't you disgust yourself while typing that out? There was a lot more to OP's argument than that, as you very well know. And that's on top of the Olympian leap it must have taken to claim that a private company tweaking the information filtering algorithms for their entirely optional leisure service can constitute a violation of personal sovereignty, which is a concept more commonly reserved for discussions on issues like indentured servitude...

For the cognitively impaired: Advertising is identified as advertising. The experiment that was conducted was not identified; people did not know this was occurring. The "researchers" did not get informed consent - which you are required to obtain. The "researchers" claimed the EULA was the informed consent.
        • by DarkOx (621550)

Advertising is not always identified. How often have you gotten a letter that is designed to look like an insurance invoice or bank check? Many look official enough that I have spent at least 20 seconds deciding if it's something I really need to act upon. All kinds of advertisers take out full-page ads and do their damnedest to disguise them as articles in print magazines and newspapers.

Sure, there is always some fine print somewhere that says "advertisement", but then I would argue facebook's EULA qualifies a

        • by martas (1439879)
          So then is product placement in a movie or TV show unethical?
      • by Tim the Gecko (745081) on Monday June 30, 2014 @08:40AM (#47349227)

        Facebook uses psychology to make minor changes in our happiness... Something must be done!

        Soda companies use psychology to sell huge buckets of sugar water... Hands off our soda, Mayor Bloomberg!

        • Facebook uses psychology to make minor changes in our happiness, at the expense of our friends, destroying relationships

          FTFY. I know it's rare amongst /. readers, but I have friends that aren't the 5 fingers on my right hand, and if they're sad or upset, I fucking well want to know so I can be there for them and help them.

          • by asylumx (881307)
            If your relationships with your friends are solely (or even largely) based on Facebook, you're doing it wrong.
          • by Darinbob (1142669)

            But Facebook has never provided 100% of all posts of all friends/non-friends/acquaintances in your feed. There's a wide variety of stuff there and they pick and choose. I don't use facebook so I don't know all of what's there, but on G+ the list of "what's hot" clearly doesn't include everything, if I reload the page I get a different list of items appearing, and sometimes I do have to go directly to a friend's page to see some recent posts.

            There's no evidence given here that Facebook decided not to show

    • That's what I thought when I read it too. I wonder if Slashdot did an a/b test with its moderation system and did some sentiment analysis on the resulting comments, would there be the same outrage?

    • I love how overblown the coverage of this has been..as if it's driven people to suicide.

Worked beautifully, hasn't it? It's all about Facebook letting stockholders and advertisers know what they are doing to improve the value of the PRODUCT (a.k.a. "users") to maximize revenue. Outrage from the product only serves to prove its effectiveness.

  • Shock and awe (Score:3, Informative)

    by Stumbles (602007) on Monday June 30, 2014 @06:43AM (#47348849)
Not really; not in the least bit should anyone be surprised. Some years ago, when Zuckerberg was asked about facebook users' data, his reply: they are fucking idiots to trust him.
    • by s.petry (762400)

This is probably the most important consideration for people to understand. Do you really think that the people pulling the strings give a rat's ass about individuals? To people like Zuckerberg, you are something to be exploited for whatever purpose they wish.

      So now that we know that they have done this once, I simply wonder how many more experiments they have done without disclosure. I have a hard time believing this is the first, or the last.

      As a first guess, the whole "timeline" feature that was forced

  • This is "social", the fucking pox of the internet, we are talking about here: "too far" and "social network norm" are usually synonymous...

    In a way, though, the fact that it doesn't go even further too far helps make it pathetic: what happens on 'social' services are the ethical transgressions of our best and brightest, equipped with nigh-unlimited funds, the assurance that they are Just That Good, and that 'disruption' is the ultimate virtue, and yet their imaginations seem to extend no further than bei
  • A/B-Testing (Score:5, Insightful)

    by bickerdyke (670000) on Monday June 30, 2014 @06:57AM (#47348891)

I understand why this should be considered wrong, and I fully understand users who don't want to have someone (least of all some company!) playing with their feelings.

    But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B-Testing? 50% of your website users will see a slightly altered version of your website, and you compare response rates to the users receiving the "old" or "original" website.

Advertisers have been manipulating our feelings for decades. News outlets have been doing it to the extent that it became part of the news format itself (I guess anyone who was watching TV news last night saw that light-hearted, cozy, human-interest or slightly oddball or cute item concluding the broadcast, right?). And creating negative feelings toward someone else has always been used in political campaigns.

It even becomes less spectacular if you consider that on facebook there has always been a selection algorithm in place that tried to select those items, from all your facebook sources, that might keep your interest focused on facebook. Without selection, your facebook would scroll past like the Star Wars end titles. Only the parameters of the selection have been fine-tuned, as they probably are at each facebook server update. It would be some new quality if that selection had been "objective" before, but being "personal" and emotional instead is what kept us at facebook already.

    So this is old news. But it should be a wake-up call: WAKE UP, THIS IS OLD NEWS! PEOPLE ARE TRYING TO MANIPULATE YOUR FEELINGS FOR AGES!

    Just in case you haven't noticed. I'm surprised about the number of people who are surprised.
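The 50/50 split this comment describes is mechanically simple. The sketch below is a generic illustration, not any particular company's system: users are assigned to a variant by hashing their id (so assignment is stable across visits without storing state), and the response rates of the two groups are compared. The experiment name and event format are invented for the example.

```python
import hashlib

def bucket(user_id: str, experiment: str) -> str:
    """Deterministically assign a user to variant 'A' or 'B' by
    hashing their id together with the experiment name."""
    h = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return "A" if h[0] % 2 == 0 else "B"

def response_rates(events):
    """events: iterable of (user_id, clicked) pairs.
    Returns the click-through rate per variant."""
    counts = {"A": [0, 0], "B": [0, 0]}  # variant -> [clicks, views]
    for user_id, clicked in events:
        b = bucket(user_id, "newsfeed-tone")
        counts[b][1] += 1
        counts[b][0] += int(clicked)
    return {v: (c / n if n else 0.0) for v, (c, n) in counts.items()}
```

Hashing on the experiment name means the same user can land in different buckets for different experiments, which is the usual way these tests run many variants side by side.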

    • by Sockatume (732728)

      It becomes a problem when you involve actual academic research staff. Private companies can do whatever the heck they like outside of a hospital, but researchers engaging in interventions are required to meet certain ethical standards as a matter of professional norm and quite often as a binding condition of any funding they have received.

    • But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B-Testing?

      Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line. If you are a beer company, you can try to promote your product to me in a way that you think might make me more inclined to buy it and that is fine as long as you aren't overly intrusive about it (think telemarketers). What is NOT fine is for them to take what I say and manipulate that to try to convince me

      • by bickerdyke (670000) on Monday June 30, 2014 @08:18AM (#47349121)

        Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line.

But according to /., that's not what happened here.

        According to this article here [slashdot.org], no messages were changed:

        Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone

        (emphasis mine).

        I agree with you that changing the actual messages would not be acceptable by any standard.

        Just in case you haven't noticed. I'm surprised about the number of people who are surprised.

        Then you do not understand what is going on. Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable. Screwing around with people's emotions in a controlled experiment should require at minimum review by a genuinely independent ethical review board and probably genuine informed consent. Facebook could be bothered with neither one. They seem to regard their users as insects to be manipulated and dissected.

        And again I agree with you that you're stepping over a line when you're consciously manipulating people's feelings for economic reasons. But this line is crossed thousandfold already. The type of environment is secondary. A/B-tests take place in controlled environments, too.

        • According to this article here [slashdot.org], no messages were changed:

If ANYTHING about the message is altered, including delivery schedule, mix of content, etc., then they are altering the message. Not everything about a message is the simple content. When you send a message, the tone you use is every bit as important to correct interpretation by the recipient. Facebook altered the messages without actually changing the specific content. If the message were unaltered (including delivery, tone, timing, etc.) then we would expect reactions to be identical.

          But this line is crossed thousandfold already.

          Even if true (which

          • According to this article here [slashdot.org], no messages were changed:

            If ANYTHING about the message is altered including delivery schedule, mix of content, etc then they are altering the message.

            Please define "message".

            It may refer to an item in your facebook stream. In which case, nothing in the messages has been altered.

Or "message" may refer to the facebook stream as a whole, made up of the smaller individual message items by your friends and/or advertisers.

In that case, facebook is the sender of the message, and the "message" has always been subject to facebook picking news items. We basically had more than one algorithm (or parameter sets for the same algorithm) that picked those messages.

        • But this line is crossed thousandfold already. The type of environment is secondary.

          So your point is everybody's doing it, so it's OK?

          That excuse didn't fly with anyone I knew growing up. I guess folks like you have forgotten some basic childhood lessons.

      • by DarkOx (621550)

        Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable.

Yes, actually, it does make it acceptable, because the people doing the experiment knew that it was very, very unlikely to cause anyone serious injury. When a psychological experiment amounts to no more than making people aware their buddies had a shitty day at work, by their own account, I don't think it actually rises to the level of requiring consent.

Humans conduct experiments all the time; it's how any self-aware being interacts with the world around them. It's just on a small scale, so nobody cares. I bet pl

The issue is clear; if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent death scenarios, etc.

      I know people say "it's just manipulating feeds, what's the harm?" There can be plenty of harm if you manipulate the feeds. Where is the line? What if facebook had decided to see what happens if you try showing depressing posts and bad new

The issue is clear; if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent death scenarios, etc.

        Yes. And I agree with you.

        I never said it was or should be accepted, I said it was widespread. And that in marketing, emotional manipulation is even out of the experimental stage.

      • Re:A/B-Testing (Score:4, Interesting)

        by Trepidity (597) <delirium-slashdot@@@hackish...org> on Monday June 30, 2014 @08:50AM (#47349261)

        Apparently they did actually get IRB approval, oddly enough. The study was jointly done with two universities, and from what other researchers have told me, the two universities' IRBs approved the protocol. I'm surprised myself that they would. Would be curious to see what their reasoning was.

        • Now that IS very interesting. I wonder how the IRB approved an experiment that clearly didn't have any participants' consent.

        • The reasoning was the same as what many are saying (IMO, incorrectly) here - that FB was already manipulating feeds so it was OK. I find this reasoning specious because, normally, FB modifies what it shows to attempt to change a narrow behavior with relatively finite consequences - whether a user clicks on an ad or not - while with this experiment the researchers were trying to alter something much more broad - a person's entire emotional state - a change with much broader implications. Given what we know f

    • It's different from A/B testing in that the experiment is explicitly designed to cause harm to half of the participants.

      Presumably most A/B testing would be designed to figure out which choice performs better on a set of metrics. But going in, there is little evidence to point to one or the other, and the "harm" caused would simply be in user experience. In this experiment, the researchers had a prior theory about which choice would cause harm, and the harm is emotional and psychological.

      All that aside, if

      • Agreed.

But interestingly enough, according to one of the news articles I read about this issue today, one of the potential harms that was supposed to be a subject of the experiment was feeling left out because of too many positive news items about one's friends.(*)

May be BS, but it may indeed be a valid and interesting theory, too.

        (*) That statement should have at least 6 pairs of "quotes" around certain "words". I left them out for readability.

  • Too far? (Score:5, Insightful)

    by ebonum (830686) on Monday June 30, 2014 @07:08AM (#47348919)

    What about what advertisers do every day?
    Our government (for us Americans) runs campaigns to alter opinions in other countries.
I'd like to see everyone in the business of "caus[ing] changes in psychological status" get "informed consent" first.
    Beer companies anyone?

    • What about what advertisers do every day?

      What about them? They don't get to run controlled experiments on me and they certainly do not get to alter what I say or how others receive what I say. Advertisers can control what they say to me and see how I react but they don't get to manipulate what I say and see how that affects others. HUGE difference.

      Our government (for us Americans) runs campaigns to alter opinions in other countries.

      They don't get to adjust what *I* say to see what effect it has on others. You really can't see the difference?

I'd like to see everyone in the business of "caus[ing] changes in psychological status" get "informed consent" first.

      When they are performing a controlled experiment on me then yes they should. If they w

  • "If you are exposing people to something that causes changes in psychological status, that's experimentation,"

    No it isn't, otherwise the above sentence would be experimentation, as it changed my psychological state from calm to annoyed. Is it too much to ask that supposed experts use their own jargon correctly?

  • by by (1706743) (1706744) on Monday June 30, 2014 @07:10AM (#47348923)
    According to the WSJ's coverage http://online.wsj.com/articles... [wsj.com] ,

    The impetus for the study was an age-old complaint of some Facebook users: That going on Facebook and seeing all the great and wonderful things other people are doing makes people feel bad about their own lives.

So although conventional wisdom might say that seeing positive things makes you happier, here there have been accusations to the contrary -- positive things about other people make you feel lousy about yourself. This study ostensibly looked at that (and I think it found something along the lines of conventional wisdom: happy posts make you post happy stuff, a [dubious!] proxy for your own happiness...).

    If Facebook knew (and how would they?) that X makes you depressed, then yes...there might be some moral issues with that. But it seems that Facebook asked a legitimate question -- especially so given that it was published in PNAS.

    That said, yeah...it feels a little shady. But then, when I log onto Facebook, I am certainly not expecting any aspect of the website to be designed with my best interests in mind!

  • Advertising frequently uses psychological pressure (for example, appealing to feelings of inadequacy) on the intended consumer, which may be harmful.

    Criticism of advertising [wikipedia.org]

    ...was my 1st thought when reading...

    "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."

    One could argue that advertising is not always done with informed consent.

  • by sjbe (173966) on Monday June 30, 2014 @07:29AM (#47348971)

    Given that Facebook has over half a billion users, it’s a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people’s emotions. Yet nobody seems to complain about this much...

    If this guy actually thinks nobody complains about this much then he isn't paying attention. However, putting that aside, his argument is a straw man. There is a VERY significant difference between changing a service and that change having an emotional impact, versus actually experimenting on the emotions of your customers directly and without their permission, without even so much as review by an independent review board. Anyone who can't comprehend the difference between the two has a pretty big ethical blind spot. The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on but not worthy of the respect one normally gives other human beings.

    –presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more

    There is a big and fairly bright line between observing users' behavior given certain stimuli as a natural experiment [wikipedia.org] and the experimental investigators manipulating those users directly without their permission in a designed experiment. The latter generally requires informed consent [wikipedia.org] for a variety of very sensible reasons relating to ethics. The fact that emotional manipulation is done in other contexts is utterly irrelevant. That's the same argument children make when they claim that "...but all my friends are doing it too". I suppose since Facebook is owned and run by an immature child billionaire, I shouldn't be surprised.

    And no, the Facebook terms of use does NOT rise to the level of informed consent.

    • by fey000 (1374173)

      The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on but not worthy of the respect one normally gives other human beings.

      And how can this surprise you? Have you ever heard anything at all about Facebook respecting the privacy of its users? In fact, again and again and again Facebook ends up in the news with an anti-privacy scandal on its hands.

      I am not saying that running social experiments on random people is a great idea (though it is funny); I am saying this is a 'no biggie' because it is neither surprising nor out of line with previous actions. That doesn't make it right, but anyone with half a brain should have seen it coming.

  • I'm a postdoc at a university, though not in a field in which you usually study human behavior. Anyway, if I experimented on humans without their prior consent, I'd lose my job. In every application for a project that involves studies on animals or humans there is an ethics form to fill out, and I must wonder how they got funding without cheating on one of those forms.

    Lying to test subjects is to some extent necessary, of course, or otherwise research in psychology would be almost impossible. However, conduct

    • I agree, this was my first thought. They screwed up big time, it would be fun to see the federal government investigate them for unlicensed human research.

  • by Registered Coward v2 (447531) on Monday June 30, 2014 @07:31AM (#47348981)
    Facebook didn't simply set out to make tweaks and see how users responded; they set up a controlled experiment on subjects without their consent, a practice that appears to violate ethical and possibly legal guidelines for behavioral research. I agree it could push them to continue to do such research and not reveal it; but when it inevitably leaks that they are doing that, it will create a PR nightmare. Facebook could have simply asked people to opt in to the study and provided the standard information regarding the study, and this would be a non-issue. For those looking for info on human research protection guidelines in the US, google the Office for Human Research Protections.
    • by Trepidity (597)

      without their consent

      What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.

      If they want to argue that this doesn't require informed consent at all, because it's e.g. just data mining of effectively existing data, that would be less problematic imo than watering down the standard for informed consent to include EULAs.

      • without their consent

        What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.

        If they want to argue that this doesn't require informed consent at all, because it's e.g. just data mining of effectively existing data, that would be less problematic imo than watering down the standard for informed consent to include EULAs.

        I agree, with an added thought. It wasn't just data mining but a controlled experiment that altered the data they received. That, IMHO, crosses the line between "let's look at the existing data" and "let's conduct an experiment."

        Another part of his argument seems to be "the impact was so small as to be negligible and thus it was OK." However, the researchers did not know the results would be negligible, so using that as an excuse after the fact doesn't fly.

    • by PvtVoid (1252388)

      Facebook didn't simply set out to make tweaks and see how users responded; they setup a controlled experiment on subjects without their consent; a practice that appears to violate ethical and possibly legal guidelines for behavioral research.

      Bingo. Advertisers may do this sort of thing all the time, but they don't get it published in peer-reviewed scientific journals without adhering to standard human research protocols. PNAS should immediately retract the article, and the researchers involved should be censured and stripped of funding.

      And people who don't want to be experimented on without consent should just fucking quit using Facebook.

  • For real, THIS issue bothers FB users? I'm speechless; you never know what's going to matter to someone.
  • by CodyRazor (1108681) on Monday June 30, 2014 @08:01AM (#47349067) Homepage

    The problem is not that they attempted to create an emotional response or manipulate people's emotions. As people constantly point out, advertisers do that all the time. But people don't seem to grasp that there is a large difference between this and advertising.

    The problem is the way it was done. People use facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call. If the telephone somehow manipulated what you heard to make your friend sound more negative or positive without changing their core meaning that would be unethical without informed consent, just as this is.

    A more extreme version would be facebook subtly modifying the content of what your friends post as it appears to you, without anyone knowing it was doing this. That would be even more unethical. The problem is misrepresentation, the method by which they attempt to manipulate emotions.

    • by martas (1439879)

      People use facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call.

      That claim would make sense if people commonly held telephone conversations with hundreds of people simultaneously who say things continuously all day long. There are plenty of forums on the Internet that display information based on simple rules like "most recent post at the top". But as long as you, your family, and just about everyone else in the country are using Facebook instead of one of those forums, then the only thing you're complaining about regarding this story is that they, for once, decided to

  • by Anonymous Coward on Monday June 30, 2014 @08:31AM (#47349169)

    I talked to several (non-tech) friends about this, and they were more upset about Facebook "censoring" out posts than the emotional manipulation. In their minds, Facebook allows everything to be shown, but certain topics gain preference due to likes or dislikes. However, they will show you everything if you scroll far enough.

    Their outrage came from the thought that FB was removing "happy" content from their feed. (That it was no longer a "dumb" pipe for social data).

  • They conducted a psychology experiment without the consent of the test subjects. I'm not sure what the rules are for private organizations, but I do believe that any publicly funded researcher involved in the experiment, or possibly those who use the results, would be at risk of losing all federal funding. I really hope some lawsuits are filed against facebook and any of the researchers, because this shouldn't creep into becoming an accepted norm.

  • by swb (14022) on Monday June 30, 2014 @09:25AM (#47349489)

    I quit using Facebook six months ago, but for a couple of years was a regular user.

    The "newsfeed" always struck me as enormously manipulated, with Facebook constantly altering the algorithm that determines what you're shown. Even nontechnical users would comment about this, wondering why they didn't see some posts from some people some times.

    Some of this may have been benign, trying to figure out what order to display posts relative to relationships, posting frequency, sort of ordinary attempts to sort out "importance".

    But I'm sure there was commercial manipulation -- ranking user comments with links to advertising-affiliated sites higher than non-affiliated sites, downranking links to sites likely to lead a person to shorten their Facebook session, etc.

    All of this could be considered "manipulation" even though there might not be one single motivation behind it and not all the factors may be even focused on a specific outcome.
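    A toy sketch of what such a multi-factor scorer might look like; the factors and weights here are entirely invented for illustration, not Facebook's actual algorithm:

```python
# Hypothetical feed scorer combining the factors speculated above:
# relationship strength, recency, and a commercial affiliate boost.
# All weights are made up for the sake of the example.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float  # 0..1, how close you are to the author
    age_hours: float        # how old the post is
    affiliate_link: bool    # links to an advertising-affiliated site

def score(p: Post) -> float:
    recency = 1.0 / (1.0 + p.age_hours)       # newer posts rank higher
    boost = 1.5 if p.affiliate_link else 1.0  # commercial thumb on the scale
    return p.author_affinity * recency * boost

# A fresh affiliate-linked post from an acquaintance can outrank an
# older post from a close friend.
feed = [Post(0.9, 2.0, False), Post(0.4, 0.5, True)]
feed.sort(key=score, reverse=True)
```

    The point being: once several opaque factors multiply together, "benign" relevance sorting and commercial manipulation become indistinguishable from the outside.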

    • by asylumx (881307)
      I have to agree. The comments on these articles are much ado about nothing. This is news for nerds, and the nerd response should be followup questions. What did the research show? What does that mean about humans? Did people from different cultures and backgrounds react differently? etc.

      The responses we see here are less nerd-like and more political.
      • by drinkypoo (153816)

        The comments on these articles are much ado about nothing.

        Nonsense. I quit facebook for the same reason, but this is still substantively different. This was deliberate manipulation of mood solely for the purpose of study. Granted, what Facebook normally does is also horrible, maybe even more so. After all, if the purpose is either to sell you more shit you don't need or to manipulate political speech, either way they're being downright evil.

  • by Opportunist (166417) on Monday June 30, 2014 @09:36AM (#47349595)

    There is no fine line here. There's only a bold one. Does it involve humans? If so, not only is tight ethical supervision required (to avoid a Milgram scenario) but, and that's the even more important part, the active and willing consent of the participating people is required.

    Anything else, no matter how "trivial" it may be seen, is simply and plainly wrong. And no, some clause somewhere hidden between another few billion lines of legalese in an EULA does NOT constitute consent to being a guinea pig!

  • Yet nobody seems to complain about this much–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more.

    Back when I was on Facebook, it seemed like every change they made was designed to make me want to use its product less. So much so that I eventually asked them to delete my account.

    • Back when I was on Facebook, it seemed like every change they made was designed to make me want to use its product less. So much so that I eventually asked them to delete my account.

      I didn't think that was "designed". I thought it was just their ineptness.

  • I read Google News because it gives several different media outlets' spin on the same story. But you need to be aware of which sites are listed and seek out coverage from the other side. They tend to give higher weight to liberal-leaning media when the story is a topic liberals are more focused on. For example, the three outlets on today's SCOTUS decision against labor unions are USAToday, LA Times, and NBC, with a Huff Post opinion piece right under that list. One can assume that's just an artifact of Google's algorithm.

  • When they start seeing if they can manipulate the murder or suicide rates, THEN we can talk about ethics. Until then, hey, anything goes.
  • why I can't get the song "Happy" out of my head.
