Facebook's Emotion Experiment: Too Far, Or Social Network Norm?

Facebook's recently disclosed 2012 experiment in altering the emotional tone of what its users saw in their news feeds has brought it plenty of negative opinions to chew on. Here's one, pointed out by an anonymous reader: Facebook's methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."

For a very different take on the Facebook experiment, consider this defense of it from Tal Yarkoni, who thinks the criticism it has drawn is "misplaced": "Given that Facebook has over half a billion users, it's a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people's emotions. Yet nobody seems to complain about this much, presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more. ... [H]aranguing Facebook and other companies like it for publicly disclosing scientifically interesting results of experiments that it is already constantly conducting anyway, and that are directly responsible for many of the positive aspects of the user experience, is not likely to accomplish anything useful. If anything, it'll only ensure that, going forward, all of Facebook's societally relevant experimental research is done in the dark, where nobody outside the company can ever find out, or complain, about it."
  • One solution (Score:5, Insightful)

    by Anonymous Coward on Monday June 30, 2014 @06:37AM (#47348827)

    Just don't use social networking.

  • I think it's fine (Score:3, Insightful)

    by Threni ( 635302 ) on Monday June 30, 2014 @06:39AM (#47348835)

I love how overblown the coverage of this has been... as if it's driven people to suicide. It's their site; they can do what they want, and people are free to leave if they want. Nothing to see here.

  • by astro ( 20275 ) on Monday June 30, 2014 @06:49AM (#47348873) Homepage

Bullshit. How do you know that you don't know anyone who was affected by it? Do you know which week in 2012 the experiment was conducted? Do you know which of the ~billion FB accounts were among the 700k experimented upon? I find it pretty shocking that so many people have difficulty understanding the difference between A/B testing and intentional emotional manipulation, where a significant negative (or positive) result was the very data point the study strove to measure.

I can quite imagine that a significant number of offline lives were impacted by this experiment. People exposed to negative content presumably don't limit their negative reactions to the venue where they encountered that content.

  • by Anonymous Coward on Monday June 30, 2014 @06:57AM (#47348889)

    Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

    Do you PERSONALLY know anyone who was affected by warrantless wire tapping? Neither do I.

As long as they never admit who it happened to, so that nobody can know whether it happened to them, then we're good? Look, there probably isn't anyone alive today (and certainly not on this website) who knew Little Albert, but that doesn't make the experiment that was done to him any less unethical.

    Facebook's TOS can obtain consent, but it can never obtain informed consent.

  • A/B-Testing (Score:5, Insightful)

    by bickerdyke ( 670000 ) on Monday June 30, 2014 @06:57AM (#47348891)

I understand why this should be considered wrong, and I fully understand users who don't want anyone (least of all some company!) playing with their feelings.

But on the other hand, considering that provoking an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B testing? 50% of your website's users see a slightly altered version of the site, and you compare their response rates with those of the users receiving the "old" or "original" version. (A minimal sketch of that mechanism follows at the end of this comment.)

Advertisers have been manipulating our feelings for decades. News outlets have been doing it for so long that it became part of the news format itself (I guess anyone who watched TV news last night saw that light-hearted, cozy, human-interest or slightly oddball or cute item concluding the broadcast, right?). And creating negative feelings toward someone else has always been a staple of political campaigns.

It becomes even less spectacular when you consider that on Facebook there has always been a selection algorithm in place, one that tries to pick, from all your Facebook sources, the items most likely to keep your interest focused on Facebook. Without selection, your feed would scroll past like the Star Wars end titles. Only the parameters of the selection were fine-tuned here, as they probably are at every Facebook server update. It would be something new if that selection had been "objective" before, but a "personal", emotional selection is what kept us on Facebook all along.

So this is old news. But it should be a wake-up call: WAKE UP, THIS IS OLD NEWS! PEOPLE HAVE BEEN TRYING TO MANIPULATE YOUR FEELINGS FOR AGES!

Just in case you haven't noticed: I'm surprised by the number of people who are surprised.
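    For the curious, here is a minimal sketch of how the A/B split described above is commonly implemented (illustrative only; the function and experiment names are hypothetical, not any real site's API):

```python
import hashlib

def ab_bucket(user_id: str, experiment: str,
              variants=("original", "altered")) -> str:
    """Deterministically assign a user to a test bucket.

    Hashing user_id together with the experiment name gives a stable,
    roughly uniform split: the same user always sees the same version,
    and with two variants about 50% of users land in each bucket.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical usage: compare response rates between the two buckets.
print(ab_bucket("user-12345", "landing-page-test"))
```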

  • Too far? (Score:5, Insightful)

    by ebonum ( 830686 ) on Monday June 30, 2014 @07:08AM (#47348919)

    What about what advertisers do every day?
    Our government (for us Americans) runs campaigns to alter opinions in other countries.
I'd like to see everyone in the business of "caus[ing] changes in psychological status" get "informed consent" first.
    Beer companies anyone?

  • Re:A/B-Testing (Score:5, Insightful)

    by TheRaven64 ( 641858 ) on Monday June 30, 2014 @07:12AM (#47348929) Journal
    A significant amount of marketing is intended to cause harm to 100% of the user-base, so being ethically unsound doesn't appear to be a problem.
  • by DarkOx ( 621550 ) on Monday June 30, 2014 @07:52AM (#47349035) Journal

That is kinda my reaction as well. It seems the issue people have here is that Facebook sought to manipulate people's emotional state. The thing is, that is exactly what just about every advertiser does all the time.

    Home Security System ads: clearly designed to make you feel vulnerable and threatened.

    Cosmetic surgery ads: clearly designed to make you feel inadequate.

Beer ads: very often designed to make you feel less accepted, as if you need their product to be perceived as cool; ditto for clothing and personal care products.

    Political ads: feelings of security and family (at least if you pick their candidate)

    This list goes on...

It might not have the same rigor as the academic world, but they absolutely do focus-group this stuff and find out how people "feel"; marketers have researched what words, phrases, and imagery best evoke these feelings. If what Facebook did is illegal or even just unethical, then so is pretty much everything the modern advertising industry has been up to for the past 70 years.

I am sure many people would actually agree with that, but I don't see why it's suddenly so shocking and deserving of attention just because Facespace does it.

  • by PolygamousRanchKid ( 1290638 ) on Monday June 30, 2014 @07:56AM (#47349049)

    1. Look up what wacky crimes were committed in January 2012.
    2. Blame them on Facebook.
    3. Sue.
4. Profit!

In January 2012 a bunch of kids formed the Islamic Caliphate of the Rusted Chevy on Cinder Blocks on my front lawn, despite their parents instructing them: "You best be staying away from Mr. Kid, he ain't right in the head."

    Obviously Facebook manipulation caused this.

  • by sjbe ( 173966 ) on Monday June 30, 2014 @08:01AM (#47349071)

    Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

Nobody knows who was affected or exactly how. That's part of the problem. They did it without knowledge or consent. They did not inform people of what they were doing, even after the fact. They did not have their design of experiment reviewed by an independent ethics board. They violated the (misplaced) trust their users had that their messages would be delivered as the users intended.

    This isn't legal documents we're talking about here, anyway. I'm also pretty sure this is covered under Facebook's EULA/TOS you didn't read.

NOTHING in Facebook's TOS remotely qualifies as informed consent to be experimented upon. I don't even have to read it to know that. It's not THAT they did this experiment, it is HOW they did this experiment. It's not hard to put the experiment proposal in front of an ethics panel. It's not hard to get informed consent if that is deemed appropriate by the ethics panel. It is standard practice to do those things, for some very, very good reasons. Facebook couldn't be bothered.

  • by sjbe ( 173966 ) on Monday June 30, 2014 @08:28AM (#47349159)

    The thing is that is exactly what just about every advertiser does all the time.

No, it is NOT the same thing. The beer company does not have any control over what *I* say, and they do not get to (legally) change what I say or how it is delivered to others. There is a HUGE difference between putting a message out there and seeing how people react to it, versus actually changing what you or I say and how it is delivered to someone else without my consent. The former is advertising, which is fine as long as it isn't too intrusive. The latter is a violation of personal sovereignty unless you obtain informed consent beforehand.

Furthermore, even if every advertiser actually did this (which they do not), and you have an ethical blind spot so large that you can't actually see what Facebook did wrong, two wrongs don't make a right. "Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors they shouldn't be engaging in.

  • Re:One solution (Score:5, Insightful)

    by rmdingler ( 1955220 ) on Monday June 30, 2014 @08:32AM (#47349177) Journal
    Some people have little interest in happiness and a good mood... you see it every day.

    Within and without social media, people, events, results, and happenstance conspire to alter your mood each and every day... something that cannot happen without your tacit permission. Grow a thicker skin and remember that yelling at that jerk in traffic means you've allowed a complete stranger power over your behavior.

    If giving up social media is too big a first step, don't go in with your eyes wide shut: you are the product, not the customer.

  • by Tim the Gecko ( 745081 ) on Monday June 30, 2014 @08:40AM (#47349227)

    Facebook uses psychology to make minor changes in our happiness... Something must be done!

    Soda companies use psychology to sell huge buckets of sugar water... Hands off our soda, Mayor Bloomberg!

  • by danudwary ( 201586 ) on Monday June 30, 2014 @09:10AM (#47349373)

    More PUBLISHED experiments, though, please. Let's know what they're doing, and what the outcomes are.

  • by nine-times ( 778537 ) <nine.times@gmail.com> on Monday June 30, 2014 @09:30AM (#47349539) Homepage

    As far as I could tell from reading about this, they didn't change what people said.

Here's the thing: Facebook already filters what you see with the default setup. Say your 500 friends each post 10 posts today, and when you load up your page the site only displays 15. So how are those 15 chosen? (I'm making up numbers here, obviously.)

The obvious choice would be to show the 15 most recent posts, but then there's a good chance you'd miss posts that are important to you, since you'd only be getting a brief snapshot of what's going on. Facebook instead has an algorithm that tries to determine which of those 5,000 posts you'll care about most. I don't know the specifics, but it includes things like favoring the people you interact with most on Facebook.

So what Facebook did in this study is tweak that algorithm to also favor posts that included negative words. The posts were still drawn from that 5,000-post pool and their contents were unedited, but you were subjected to a different selection in order to conduct the research. (A rough sketch of that kind of re-ranking follows at the end of this comment.)

It's still an open question whether this sort of thing is appropriate, but it's important to note that this is something Facebook does all the time anyway. I think where it gets creepy is that Facebook is also an ad-driven company, so you have to wonder what the eventual goal of this research is. I can imagine Facebook favoring posts that include pictures of food to go along with an ad campaign for Seamless. Maybe they'll make a deal with pharmaceutical companies to adjust your feed to make you depressed, while at the same time plastering your feed with ads for antidepressants.
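    To make the selection-versus-editing distinction concrete, here is a minimal sketch of that kind of re-ranking (purely illustrative; Facebook's real ranking algorithm is not public, and the word list, weights, and function names here are invented — the study itself reportedly classified words with the LIWC lexicon):

```python
# Hypothetical word list standing in for a sentiment lexicon.
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible", "hate"}

def score(post: dict, affinity: dict) -> float:
    """Rank a post by author affinity plus an experimental negativity bonus."""
    base = affinity.get(post["author"], 0.0)  # favor people you interact with
    negativity = sum(w in NEGATIVE_WORDS for w in post["text"].lower().split())
    return base + 0.5 * negativity  # the tweak: bias selection toward negative posts

def select_feed(posts: list, affinity: dict, limit: int = 15) -> list:
    """Pick the top posts; contents are never edited, only re-ranked."""
    return sorted(posts, key=lambda p: score(p, affinity), reverse=True)[:limit]

# Example: the negative post outranks a closer friend's cheerful one.
posts = [{"author": "alice", "text": "What an awful, terrible day"},
         {"author": "bob", "text": "Lovely picnic today"}]
print(select_feed(posts, {"bob": 0.5}, limit=1))
```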

  • by Opportunist ( 166417 ) on Monday June 30, 2014 @09:36AM (#47349595)

There is no fine line here, only a bold one. Does it involve humans? If so, not only is tight ethical supervision required (to avoid a Milgram scenario) but, and this is the even more important part, the active and willing consent of the participants is required.

Anything else, no matter how "trivial" it may seem, is simply and plainly wrong. And no, some clause hidden somewhere among a few billion lines of legalese in an EULA does NOT constitute consent to being a guinea pig!

  • Re:One solution (Score:3, Insightful)

    by Impy the Impiuos Imp ( 442658 ) on Monday June 30, 2014 @09:54AM (#47349735) Journal

    Wow that was an impressive misspelling of "surgical".

  • by tomhath ( 637240 ) on Monday June 30, 2014 @10:49AM (#47350189)

I read Google News because it gives several different media outlets' spin on the same story, but you need to be aware of which sites are listed and seek out coverage from the other side. It tends to give higher weight to liberal-leaning media when the story is a topic liberals are more focused on. For example, the three outlets on today's SCOTUS decision against labor unions are USA Today, the LA Times, and NBC, with a Huff Post opinion piece right under that list. One can assume that's just an artifact of Google's ranking based on page views, since libs are more likely to read about the story on left-leaning outlets, and for conservatives this is kind of a "well, duh" decision.

CNN seems to send a subliminal message in its placement of stories: good news for Democrats and bad news for Republicans tend to be given more prominent coverage. More obviously biased outlets like USA Today, NBC, Huff Post, and Fox don't try to hide it.

  • Re:One solution (Score:4, Insightful)

    by s.petry ( 762400 ) on Monday June 30, 2014 @11:45AM (#47350641)

    Mostly this, but I believe it could have been worded differently for clarity.

1. Like all institutions that have an impact on the public, Facebook and Twitter are being used, and have been used, for propaganda. Guiding opinion happens from the time a kid enters public school through to graduating from a university. We don't like to admit it, because the scope of the indoctrination is frightening when you investigate it. Reality can be a bummer, but it is reality.

2. Just as with television, there is a massive amount of social engineering in online media and content. How many "news" programs want you to "follow us on Twitter" and "like us on Facebook"? All of them, and you will hear and see those phrases repeated over and over.

3. Using institutions like education and social media for "social engineering" is propaganda in its purest form, and insidious. It is much worse than bread and circuses alone, because it constantly supplies an opinion that someone wants you to have, without any dialogue or discussion of the impact or morality of that opinion: e.g., the Iraq war, Syria, candidates for public office, etc.

4. The fact that propaganda wars are waged against citizens is not new; it has been documented for as far back as I can recall. Certain people owned newspapers and provided an opinion. If a counter-opinion was offered, it was generally squashed by ad hominem, and when that didn't work it simply wasn't discussed in any mass media. This has carried over to radio, TV, and now social media. The scale at which it's currently done is massive compared to when I was a kid.

    Investigating and learning about the psychology being used is like taking the red pill. A whole new world opens up to you, and you can see how much propaganda is being generated.
