Facebook's Emotion Experiment: Too Far, Or Social Network Norm?
Facebook's recently disclosed 2012 experiment in altering the tone of what its users saw in their newsfeeds has brought it plenty of negative opinions to chew on. Here's one, pointed out by an anonymous reader: Facebook's methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."
For a very different take on the Facebook experiment, consider this defense of it from Tal Yarkoni, who thinks the criticism it's drawn is "misplaced": Given that Facebook has over half a billion users, it’s a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people’s emotions. Yet nobody seems to complain about this much–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more. ... [H]aranguing Facebook and other companies like it for publicly disclosing scientifically interesting results of experiments that it is already constantly conducting anyway–and that are directly responsible for many of the positive aspects of the user experience–is not likely to accomplish anything useful. If anything, it’ll only ensure that, going forward, all of Facebook’s societally relevant experimental research is done in the dark, where nobody outside the company can ever find out–or complain–about it.
One solution (Score:5, Insightful)
Just don't use social networking.
Re: (Score:3, Interesting)
But think of the implications for upcoming elections! How will the Repubmocrats keep 100% power against independents, the tea party, and other radical despots competing against the chosen ones? Control! The people obviously need to be controlled; they don't know what is good for them, and the Repubmocrats always will.
Since there is a market, Facebook, which covers most demographics, can help by raising tensions toward sinister interlopers in our one-party political system.
Look for upcoming "hour of hate" shows profiling
Re: (Score:3)
How will the Repubmocrats keep 100% power against independents, tea party and other radical despots competing against the chosen ones?
https://www.youtube.com/watch?... [youtube.com]
It's called 'First Past the Post', and Facebook has nothing to do with it.
Re: (Score:3)
How will the Repubmocrats keep 100% power against independents, tea party and other radical despots competing against the chosen ones?
It's called 'First Past the Post', and Facebook has nothing to do with it.
You're right that "first past the post" is a big part of the problem. But that's far from the whole story.
For one thing, "Repubmocrats" do NOT have 100% of the power. It may be near 100%, but it isn't 100% -- for example currently [wikipedia.org] two U.S. senators, four major city mayors, and hundreds of state and local officials across the U.S. are elected independents or members of 3rd parties.
That means that quite a few voters across the U.S. have actual experience in ELECTING someone who is not a member of the tw
Re:One solution (Score:4, Insightful)
Mostly this, but I believe it could have been worded differently for clarity.
1. Like all institutions that impact the public, Facebook and Twitter are being used, and have been used, for propaganda. Guiding opinion happens from the time a kid enters public school through to graduating from a university. We don't like to admit it, because it's frightening when you investigate the scope at which people are being indoctrinated. Reality can be a bummer, but it is reality.
2. Just like with Television, there is a massive amount of social engineering with online media and content. How many "News" programs want you to "follow us on Twitter" and "Like us on Facebook"? All of them, and you will hear and see that phrase repeated over and over.
3. Use of institutions like education and social media for purposes of "social engineering" is propaganda in its purest form, and insidious. This is much worse than bread and circuses alone, because it constantly provides an opinion that someone wants you to have without any dialogue or discussion on the impact or morality of the opinion. E.g., the Iraq war, Syria, candidates for public offices, etc.
4. The fact that propaganda wars are waged against citizens is not new. It's been documented back as far as I can recall. Certain people owned newspapers and provided an opinion. If a counter-opinion was offered, it was generally squashed by ad hominem, and when that didn't work it simply wasn't discussed on any mass media. This has been transferred to radio, TV, and now social media. The scale at which it's currently done is massive compared to when I was a kid.
Investigating and learning about the psychology being used is like taking the red pill. A whole new world opens up to you, and you can see how much propaganda is being generated.
Re: (Score:2)
Lol, G+ is ... an ANTI-social network.
G+ suits me fine, I have no friends IRL either.
You insensitive clod. People sometimes go on at me about being antisocial, I think they're being overly judgemental in implying it's a bad thing.
Re: (Score:2)
Why, I'd love to have you outside my circles!
Get on wit'cha bad self.
Re:One solution (Score:5, Insightful)
Within and without social media, people, events, results, and happenstance conspire to alter your mood each and every day... something that cannot happen without your tacit permission. Grow a thicker skin and remember that yelling at that jerk in traffic means you've allowed a complete stranger power over your behavior.
If giving up social media is too big a first step, don't go in with your eyes wide shut: you are the product, not the customer.
Re: (Score:3, Insightful)
Wow that was an impressive misspelling of "surgical".
Don't worry! (Score:2)
It's all perfectly harmless. ctOS is here for you! [wikia.com] For those that haven't played the game, stop reading. One of the plot points in the game was the "targeted reassignment" of vote predictions to get a mayor re-elected.
more interesting... (Score:5, Interesting)
It doesn't sound like this is the first experiment done by the Facebook crowd, which raises questions: What other experiments have happened? Were the participants informed about them later? Who takes the blame if such an experiment results in someone getting hurt?
Re:more interesting... (Score:5, Interesting)
If Google was having a hard time deciding whether a page was junk or not, would it be unethical to put it in the results for some users and see how they react? Clearly that's an experiment without user knowledge, but it certainly doesn't sound unethical to me, and stopping that kind of experimentation, or flooding sites with notices about it, wouldn't make things better for users.
Obviously there are experiments they could run that would be unethical if users weren't informed and monitored; discussing where the lines are and agreeing some best practices would therefore make sense.
I think it's fine (Score:3, Insightful)
I love how overblown the coverage of this has been... as if it's driven people to suicide. It's their site; they can do what they want, and people are free to leave if they want. Nothing to see here.
Re: (Score:2)
I think they didn't go far enough; more experiments like this should be done. Never before has such a database been compiled (other than by the NSA). Much can be learned.
Re:I think it's fine (Score:4, Insightful)
More PUBLISHED experiments, though, please. Let's know what they're doing, and what the outcomes are.
Re:I think it's fine (Score:4, Insightful)
That is kinda my reaction as well. It seems the issue people have here is that Facebook sought to manipulate people's emotional state. The thing is, that is exactly what just about every advertiser does all the time.
Home Security System ads: clearly designed to make you feel vulnerable and threatened.
Cosmetic surgery ads: clearly designed to make you feel inadequate.
Beer ads: very often designed to make you feel less accepted, as if you need their product to be perceived as cool. Ditto for clothing and personal care products.
Political ads: feelings of security and family (at least if you pick their candidate)
This list goes on...
It might not have the same rigor as the academic world, but they absolutely do focus-group this stuff and find out how people 'feel'. Marketers have researched what words, phrases, and imagery can best evoke these feelings. If what Facebook did is illegal or even just unethical, then so is pretty much everything the modern advertising industry has been up to for the past 70 years.
I am sure many people would actually agree with that, but I don't see why it's suddenly so shocking and deserving of attention just because facespace does it.
This is not advertising (Score:4, Insightful)
The thing is that is exactly what just about every advertiser does all the time.
No it is NOT the same thing. The beer company does not have any control over what *I* say, and they do not get to (legally) change what I say or how it is delivered to others. There is a HUGE difference between putting a message out there and seeing how people react to it versus actually changing what you or I say and how it is delivered to someone else without my consent. The former is advertising, which is fine as long as it isn't too intrusive. The latter is a violation of personal sovereignty unless you obtain informed consent beforehand.
Furthermore even if every advertiser actually did this (which they do not) and you have an ethical blind spot so large that you can't actually see what Facebook did wrong, two wrongs don't make a right. "Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors that they shouldn't be engaging in.
Re:This is not advertising (Score:5, Insightful)
As far as I could tell from reading about this, they didn't change what people said.
Here's the thing, Facebook already filters what you see with the default setup. Your 500 friends each post 10 posts today, and when you load up your page on a social networking site, the page only displays 15. So how are those 15 chosen? (I'm making up numbers here, obviously)
The obvious choice would be to show the 15 most recent posts, but that means there's a good chance you'll miss posts that are important and that you'd like to see, since you're only getting a brief snapshot of what's going on in that social networking site. Facebook instead has an algorithm that tries to determine which of those 5,000 posts you'll care most about. I don't know the specifics, but it includes things like favoring the people who you interact with most on Facebook.
So what Facebook did in this study is they tweaked that algorithm to also favor posts that included negative words. The posts were still from that 5,000 post pool and the contents of the posts were unedited, but they subjected you to a different selection in order to conduct the research.
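To make that concrete: the tweak could be as small as adding one extra term to the ranking score. Here's a minimal Python sketch (purely illustrative; the post fields, the word list, and the weights are my own stand-ins, not Facebook's actual code):

    import re

    NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible", "hate"}

    def base_score(post):
        # Stand-in for whatever relevance signals the real feed uses
        # (affinity with the poster, recency, past interactions, ...).
        return post["affinity"] * post["recency"]

    def negativity(post):
        # Crude word-list sentiment, reportedly similar in spirit to the
        # word-counting software the published study used.
        words = re.findall(r"[a-z']+", post["text"].lower())
        return sum(w in NEGATIVE_WORDS for w in words) / max(len(words), 1)

    def rank_feed(posts, negativity_weight=0.0, limit=15):
        # negativity_weight > 0 biases *selection* toward negative posts;
        # no post's content is edited, only which posts get shown.
        key = lambda p: base_score(p) + negativity_weight * negativity(p)
        return sorted(posts, key=key, reverse=True)[:limit]

Note that rank_feed with negativity_weight=0.0 is just the everyday feed; the "experiment" is one parameter away.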
It's still an open question as to whether this sort of thing is appropriate, but it's important to note that this is something Facebook does all the time anyway. I think where it gets creepy is that Facebook is also an ad-driven company, so you have to wonder what the eventual goal of this research is. I can imagine Facebook favoring posts that include pictures of food to go along with an ad campaign for Seamless. Maybe they'll make a deal with pharmaceutical companies to adjust your feed to make you depressed, while at the same time plastering your feed with ads for antidepressants.
Communication is more than syntax (Score:2)
As far as I could tell from reading about this, they didn't change what people said.
Yes they did. There is more to communication than the specific words used. Tone, timing, delivery, emphasis, etc. are all part of the message. If Facebook altered any of these to be different from the expectations of the user without informing them beforehand, then they changed what people said. There is MUCH more to human communication than the syntax used.
It's still an open question as to whether this sort of thing is appropriate
I disagree. I don't think it is an open question at all. How Facebook did what they did is unacceptable. Doing experiments like this is fine in pri
Re: (Score:2)
Yes they did. There is more to communication than the specific words used. Tone, timing, delivery, emphasis, etc. are all part of the message. If Facebook altered any of these to be different from the expectations of the user without informing them beforehand, then they changed what people said. There is MUCH more to human communication than the syntax used.
I'm not sure how you think they changed the tone, timing, delivery, or emphasis of the messages. Apparently they used real posts and posted the entire content of each post without alteration. From what I understand, though I'm interpreting from a few different stories that I read, all they did was to alter the algorithm that Facebook already uses to choose which posts to show in your feed. They didn't insert or remove words from the posts. They didn't do anything to really re-contextualize them.
Whether
Re: (Score:2)
"Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors that they shouldn't be engaging in.
Ugh, didn't you disgust yourself while typing that out? There was a lot more to OP's argument than that, as you very well know. And that's on top of the Olympian leap it must have taken to claim that a private company tweaking the information filtering algorithms for their entirely optional leisure service can constitute a violation of personal sovereignty, which is a concept more commonly reserved for discussions on issues like indentured servitude...
Re: (Score:2)
Advertising is not always identified. How often have you gotten a letter designed to look like an insurance invoice or a bank check? Many look official enough that I have spent at least 20 seconds deciding if it's something I really need to act upon. All kinds of advertisers take out full-page ads and do their damnedest to disguise them as articles in print magazines and newspapers.
Sure, there is always some fine print somewhere that says "advertisement", but then I would argue Facebook's EULA qualifies a
Re:I think it's fine (Score:5, Insightful)
Facebook uses psychology to make minor changes in our happiness... Something must be done!
Soda companies use psychology to sell huge buckets of sugar water... Hands off our soda, Mayor Bloomberg!
Re: (Score:3)
But Facebook has never provided 100% of all posts of all friends/non-friends/acquaintances in your feed. There's a wide variety of stuff there and they pick and choose. I don't use facebook so I don't know all of what's there, but on G+ the list of "what's hot" clearly doesn't include everything, if I reload the page I get a different list of items appearing, and sometimes I do have to go directly to a friend's page to see some recent posts.
There's no evidence given here that Facebook decided not to show
Re: I think it's fine (Score:2)
That's what I thought when I read it too. I wonder: if Slashdot did an A/B test with its moderation system and did some sentiment analysis on the resulting comments, would there be the same outrage?
Re: (Score:2)
I love how overblown the coverage of this has been..as if it's driven people to suicide.
Worked beautifully, didn't it? It's all about Facebook letting stockholders and advertisers know what they are doing to improve the value of the PRODUCT (a.k.a. "users") to maximize revenue. Outrage from the product only serves to prove its effectiveness.
Shock and awe (Score:3, Informative)
Re: (Score:3)
This is probably the most important consideration for people to understand. Do you really think that the people pulling the strings give a rat's ass about individuals? To people like Zuckerberg, you are something to be exploited for whatever purpose they wish.
So now that we know that they have done this once, I simply wonder how many more experiments they have done without disclosure. I have a hard time believing this is the first, or the last.
As a first guess, the whole "timeline" feature that was forced
Holy False Dichotomy; Batman! (Score:2)
In a way, though, the fact that it doesn't go even further helps make it pathetic: what happens on 'social' services are the ethical transgressions of our best and brightest, equipped with nigh-unlimited funds, the assurance that they are Just That Good, and the belief that 'disruption' is the ultimate virtue, and yet their imaginations seem to extend no further than bei
A/B-Testing (Score:5, Insightful)
I understand why this should be considered wrong, and I fully understand users who don't want to have someone (let alone some company!) playing with their feelings.
But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B-Testing? 50% of your website users will see a slightly altered version of your website, and you compare response rates to the users receiving the "old" or "original" website.
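The mechanics are trivial, too. A sketch of the usual deterministic bucketing (illustrative only; the names are invented):

    import hashlib

    def assign_variant(user_id, experiment="feed_tweak", split=0.5):
        # Deterministic bucketing: hash the (experiment, user) pair so the
        # same user always sees the same variant, with no stored state.
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return "B" if int(digest[:8], 16) / 0xFFFFFFFF < split else "A"

    print(assign_variant("user_42"))   # always the same answer for user_42

The same user always lands in the same bucket, so every user of every large website is effectively sitting in some experiment's A or B group all the time.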
Advertisers have been manipulating our feelings for decades. News outlets have been doing it to the extent that it became part of the news format itself (I guess anyone who was watching TV news last night saw that light-hearted, cozy, human-interest or slightly oddball or cute item concluding the broadcast, right?). And creating negative feelings toward someone else has always been used in political campaigns.
It becomes even less spectacular if you consider that on Facebook there has always been a selection algorithm in place that tries to select, from all your Facebook sources, those items that might keep your interest focused on Facebook. Without selection, your Facebook feed would scroll past like the Star Wars end titles. Only the parameters of the selection have been fine-tuned, as they probably are at each Facebook server update. It would be some new quality if that selection had been "objective" before, but its being "personal" and emotional instead is what kept us at Facebook already.
So this is old news. But it should be a wake-up call: WAKE UP, THIS IS OLD NEWS! PEOPLE HAVE BEEN TRYING TO MANIPULATE YOUR FEELINGS FOR AGES!
Just in case you haven't noticed. I'm surprised about the number of people who are surprised.
Re: (Score:2)
It becomes a problem when you involve actual academic research staff. Private companies can do whatever the heck they like outside of a hospital, but researchers engaging in interventions are required to meet certain ethical standards as a matter of professional norm and quite often as a binding condition of any funding they have received.
Messaging versus manipulation of content (Score:3)
But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B-Testing?
Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line. If you are a beer company, you can try to promote your product to me in a way that you think might make me more inclined to buy it and that is fine as long as you aren't overly intrusive about it (think telemarketers). What is NOT fine is for them to take what I say and manipulate that to try to convince me
Re:Messaging versus manipulation of content (Score:4, Interesting)
Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line.
But according to /., that's not what happened here.
According to this article here [slashdot.org], no messages were changed:
Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone
(emphasis mine).
I agree with you that changing the actual messages would not be acceptable by any standard.
Just in case you haven't noticed. I'm surprised about the number of people who are surprised.
Then you do not understand what is going on. Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable. Screwing around with people's emotions in a controlled experiment should require at minimum review by a genuinely independent ethical review board and probably genuine informed consent. Facebook could be bothered with neither one. They seem to regard their users as insects to be manipulated and dissected.
And again I agree with you that you're stepping over a line when you're consciously manipulating people's feelings for economic reasons. But this line is crossed thousandfold already. The type of environment is secondary. A/B-tests take place in controlled environments, too.
Tone and delivery are part of the message (Score:2)
According to this article here [slashdot.org], no messages were changed:
If ANYTHING about the message is altered, including delivery schedule, mix of content, etc., then they are altering the message. Not everything about a message is the simple content. When you send a message, the tone you use is every bit as important to correct interpretation by the recipient as the words themselves. Facebook altered the messages without actually changing the specific content. If the message was unaltered (including delivery, tone, timing, etc.) then we would expect reactions to be identical.
But this line is crossed thousandfold already.
Even if true (which
Re: (Score:2)
According to this article here [slashdot.org], no messages were changed:
If ANYTHING about the message is altered, including delivery schedule, mix of content, etc., then they are altering the message.
Please define "message".
It may refer to an item in your Facebook stream, in which case nothing in the messages has been altered.
Or "message" may refer to the Facebook stream as a whole, made up of the smaller individual message items by your friends and/or advertisers.
In that case, Facebook is the sender of the message, and the "message" has always been subject to Facebook picking news items. We basically had more than one algorithm (or parameter sets for the same algorithm) that picked those messages.
Re: (Score:2)
But this line is crossed thousandfold already. The type of environment is secondary.
So your point is everybody's doing it, so it's OK?
That excuse didn't fly with anyone I knew growing up. I guess folks like you have forgotten some basic childhood lessons.
Re: (Score:2)
Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable.
Yes, actually, it does make it acceptable, because the people doing the experiment knew that it was very, very unlikely to cause anyone serious injury. When a psychological experiment amounts to no more than making people aware their buddies had a shitty day at work by their own account, I don't think it actually rises to the level of requiring consent.
Humans conduct experiments all the time; it's how any self-aware being interacts with the world around them. It's just on a small scale, so nobody cares. I bet pl
Re: (Score:3)
The issue is clear; if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent death scenarios, etc.
I know people say "it's just manipulating feeds, what's the harm?" There can be plenty of harm if you manipulate the feeds. Where is the line? What if facebook had decided to see what happens if you try showing depressing posts and bad new
Re: (Score:2)
The issue is clear; if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent death scenarios, etc.
Yes. And I agree with you.
I never said it was or should be accepted, I said it was widespread. And that in marketing, emotional manipulation is even out of the experimental stage.
Re:A/B-Testing (Score:4, Interesting)
Apparently they did actually get IRB approval, oddly enough. The study was jointly done with two universities, and from what other researchers have told me, the two universities' IRBs approved the protocol. I'm surprised myself that they would. Would be curious to see what their reasoning was.
Re: (Score:2)
Now that IS very interesting. I wonder how the IRB approved an experiment that clearly didn't have any participants' consent.
Re: (Score:3)
The reasoning was the same as what many are saying (IMO, incorrectly) here - that FB was already manipulating feeds so it was OK. I find this reasoning specious because, normally, FB modifies what it shows to attempt to change a narrow behavior with relatively finite consequences - whether a user clicks on an ad or not - while with this experiment the researchers were trying to alter something much more broad - a person's entire emotional state - a change with much broader implications. Given what we know f
Advertising =/= scientific research (Score:2)
It's different from A/B testing in that the experiment is explicitly designed to cause harm to half of the participants.
Presumably most A/B testing would be designed to figure out which choice performs better on a set of metrics. But going in, there is little evidence to point to one or the other, and the "harm" caused would simply be in user experience. In this experiment, the researchers had a prior theory about which choice would cause harm, and the harm is emotional and psychological.
All that aside, if
Re: (Score:2)
Agreed.
But interestingly enough, according to one of the news articles I read about the issue today, one of the potential harms that was supposed to be the subject of the experiment was feeling left out because of too much positive news about one's friends.(*)
May be BS, but may indeed be a valid and interesting theory, too.
(*) That statement should have at least 6 pairs of "quotes" around certain "words". I left them out for readability.
Re:A/B-Testing (Score:5, Insightful)
Re: (Score:2)
So all kinds of products trying to claim they're healthy while they're absolutely not isn't happening, and never has happened?
under which rock have you been living?
Re: (Score:2)
e-Cigarettes, for example.
Re: (Score:2)
First, no it's not, nice try.
At the very least, the majority of advertising is aiming to make people buy things that they don't need. Beyond that, it's often stuff that's unhealthy or inferior to alternatives available at a lower price.
Second, people are aware that it is marketing/advertising
No they're not. For example, count the number of adverts that you're aware of in a film some time. Then look up how many careful product placements there are. See also, paid product reviews, social network endorsements, and so on. Most people are aware of a small fraction of the marketing targeted a
Too far? (Score:5, Insightful)
What about what advertisers do every day?
Our government (for us Americans) runs campaigns to alter opinions in other countries.
I'd like to see everyone in the business of "caus[ing] changes in psychological status" be required to get "informed consent" first.
Beer companies anyone?
Messaging versus content manipulation (Score:2)
What about what advertisers do every day?
What about them? They don't get to run controlled experiments on me and they certainly do not get to alter what I say or how others receive what I say. Advertisers can control what they say to me and see how I react but they don't get to manipulate what I say and see how that affects others. HUGE difference.
Our government (for us Americans) runs campaigns to alter opinions in other countries.
They don't get to adjust what *I* say to see what effect it has on others. You really can't see the difference?
I'd like to see everyone in the business of "caus[ing] changes in psychological status" be required to get "informed consent" first.
When they are performing a controlled experiment on me then yes they should. If they w
Don't read if you don't want your emotions changed (Score:2)
"If you are exposing people to something that causes changes in psychological status, that's experimentation,"
No it isn't, otherwise the above sentence would be experimentation, as it changed my psychological state from calm to annoyed. Is it too much to ask that supposed experts use their own jargon correctly?
"Victims" received positive or negative newsfeeds? (Score:4, Interesting)
The impetus for the study was an age-old complaint of some Facebook users: That going on Facebook and seeing all the great and wonderful things other people are doing makes people feel bad about their own lives.
So although conventional wisdom might say that seeing positive things makes you happier, here there have been accusations to the contrary: positive things about other people make you feel lousy about yourself. This study ostensibly looked at that (and I think it found something along the lines of conventional wisdom: happy posts make you post happy stuff, a [dubious!] proxy for your own happiness...).
If Facebook knew (and how would they?) that X makes you depressed, then yes...there might be some moral issues with that. But it seems that Facebook asked a legitimate question -- especially so given that it was published in PNAS.
That said, yeah...it feels a little shady. But then, when I log onto Facebook, I am certainly not expecting any aspect of the website to be designed with my best interests in mind!
Advertising? (Score:2)
Advertising frequently uses psychological pressure (for example, appealing to feelings of inadequacy) on the intended consumer, which may be harmful.
Criticism of advertising [wikipedia.org]
...was my 1st thought when reading...
"If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."
One could argue that advertising is not always done with informed consent.
Natural vs randomized experiments (Score:4)
Given that Facebook has over half a billion users, it’s a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people’s emotions. Yet nobody seems to complain about this much...
If this guy actually thinks nobody complains about this much, then he isn't paying attention. However, putting that aside, his argument is a straw man. There is a VERY significant difference between changing a service and that change having an emotional impact versus actually experimenting on the emotions of your customers directly and without their permission, without even so much as review by an independent review board. Anyone who can't comprehend the difference between the two has a pretty big ethical blind spot. The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on but not worthy of the respect one normally gives other human beings.
–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more
There is a big and fairly bright line between observing users' behavior given certain stimuli as a natural experiment [wikipedia.org] and the experimental investigators manipulating those users directly without their permission in a designed experiment. The latter generally requires informed consent [wikipedia.org] for a variety of very sensible reasons relating to ethics. The fact that emotional manipulation is done in other contexts is utterly irrelevant. That's the same argument children make when they claim that "...but all my friends are doing it too". I suppose, since Facebook is owned and run by an immature child billionaire, that I shouldn't be surprised.
And no, Facebook's terms of use do NOT rise to the level of informed consent.
Re: (Score:2)
The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on but not worthy of the respect one normally gives other human beings.
And how can this surprise you? Have you ever heard anything at all about Facebook respecting the privacy of its users? In fact, again and again and again Facebook ends up in the news with an anti-privacy scandal on its hands.
I am not saying that running social experiments on random people is a great idea (though it is funny), I am saying this is a 'no biggie' because it is neither surprising nor out of line with previous actions. That doesn't make it right, but anyone with half a brain should have seen it c
Unethical (Score:2)
I'm a postdoc at a university, though not in a field in which you usually study human behavior. Anyway, if I experimented on humans without their prior consent, I'd lose my job. In every application for a project that involves studies on animals or humans there is an ethics form to fill out, and I must wonder how they got funding without cheating on one of those forms.
Lying to test subjects is to some extent necessary, of course, or otherwise research in psychology would be almost impossible. However, conduct
Re: (Score:2)
I agree, this was my first thought. They screwed up big time, it would be fun to see the federal government investigate them for unlicensed human research.
Re: (Score:2)
They did receive IRB approval, however the protocol listed in the paper expressly breaches one of the IRBs' rules, and may breach several others depending on how the study was performed. It shouldn't have been approved.
Yarkoni misses the point (Score:3)
Re: (Score:2)
without their consent
What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.
If they want to argue that this doesn't require informed consent at all, beca
Re: (Score:3)
without their consent
What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.
If they want to argue that this doesn't require informed consent at all, because it's e.g. just data mining of effectively existing data, that would be less problematic imo than watering down the standard for informed consent to include EULAs.
I agree, with an added thought. It wasn't just data mining but a controlled experiment that altered the data they received. That, IMHO, crosses the line between "let's look at the existing data" and "let's conduct an experiment."
Another part of his argument seems to be "the impact was so small as to be negligible and thus it was OK." However, the researchers did not know the results would be negligible, so using that as an excuse after the fact doesn't fly.
Re: (Score:2)
Facebook didn't simply set out to make tweaks and see how users responded; they set up a controlled experiment on subjects without their consent, a practice that appears to violate ethical and possibly legal guidelines for behavioral research.
Bingo. Advertisers may do this sort of thing all the time, but they don't get it published in peer-reviewed scientific journals without adhering to standard human research protocols. PNAS should immediately retract the article, and the researchers involved should be censured and stripped of funding.
And people who don't want to be experimented on without consent should just fucking quit using Facebook.
Slow news day . . . (Score:2)
Creating emotional response is not the issue (Score:5, Interesting)
The problem is not that they attempted to create an emotional response or manipulate people's emotions. As people are constantly pointing out, advertisers do that all the time. People don't seem to grasp that there is a large difference between this and advertising.
The problem is the way it was done. People use Facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call. If the telephone somehow manipulated what you heard to make your friend sound more negative or positive without changing their core meaning, that would be unethical without informed consent, just as this is.
A more extreme version would be Facebook subtly modifying the content of what your friends post as it appears to you, without anyone knowing it was doing this. That would be even more unethical. The problem is misrepresentation: the method by which they attempt to manipulate emotions.
Re: (Score:2)
People use Facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call.
That claim would make sense if people commonly held telephone conversations with hundreds of people simultaneously who say things continuously all day long. There are plenty of forums on the Internet that display information based on simple rules like "most recent post at the top". But as long as you, your family, and just about everyone else in the country are using Facebook instead of one of those forums, then the only thing you're complaining about regarding this story is that they, for once, decided to
Outrage due to Censorship, not the test (Score:4, Interesting)
I talked to several (non-tech) friends about this, and they were more upset about Facebook "censoring" out posts than the emotional manipulation. In their minds, Facebook allows everything to be shown, but certain topics gain preference due to likes or dislikes. However, they will show you everything if you scroll far enough.
Their outrage came from the thought that FB was removing "happy" content from their feed. (That it was no longer a "dumb" pipe for social data).
Consent is required (Score:2)
They conducted a psychology experiment without the consent of the test subjects. I'm not sure what the rules are for private organizations, but I do believe that any publicly funded researcher involved in the experiment, or possibly anyone who uses the results, would be at risk of losing all federal funding. I really hope some lawsuits are filed against Facebook and any of the researchers, because this shouldn't creep into becoming an accepted norm.
Isn't the FB Newsfeed a giant experiment anyway? (Score:3)
I quit using Facebook six months ago, but for a couple of years was a regular user.
The "newsfeed" always struck me as enormously manipulated, with Facebook constantly altering the algorithm that determines what you're shown. Even nontechnical users would comment about this, wondering why they didn't see some posts from some people some times.
Some of this may have been benign, trying to figure out what order to display posts relative to relationships, posting frequency, sort of ordinary attempts to sort out "importance".
But I'm sure there was commercial manipulation -- ranking user comments with links to advertising-affiliated sites higher than non-affiliated sites, downranking links to sites likely to lead a person to shorten their Facebook session, etc.
All of this could be considered "manipulation" even though there might not be one single motivation behind it and not all the factors may be even focused on a specific outcome.
Re: (Score:2)
The responses we see here are less nerd-like and more political.
Re: (Score:2)
The comments on these articles is much ado about nothing.
Nonsense. I quit Facebook for the same reason, but this is still substantively different. This was deliberate manipulation of mood solely for the purpose of study. Granted, what Facebook normally does is also horrible, maybe even more so. After all, if the purpose is either to sell you more shit you don't need, or to manipulate political speech, either way they're being downright evil.
Human experimentation needs close supervision (Score:4, Insightful)
There is no fine line here. There's only a bold one. Does it involve humans? If so, not only is tight ethical supervision required (to avoid a Milgram scenario) but, and that's the even more important part, the active and willing consent of the participating people is required.
Anything else, no matter how "trivial" it may seem, is simply and plainly wrong. And no, some clause hidden somewhere among another few billion lines of legalese in an EULA does NOT constitute consent to being a guinea pig!
That wasn't my experience (Score:2)
Yet nobody seems to complain about this much–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more.
Back when I was on Facebook, it seemed like every change they made was designed to make me want to use its product less. So much so that I eventually asked them to delete my account.
Re: (Score:2)
Back when I was on Facebook, it seemed like every change they made was designed to make me want to use its product less. So much so that I eventually asked them to delete my account.
I didn't think that was "designed". I thought it was just their ineptness.
Google and Media outlets too (Score:2, Insightful)
I read Google News because it gives several different media outlets' spin on the same story. But you need to be aware of which sites are listed and seek out coverage from the other side. They tend to give higher weight to liberal leaning media when the story is a topic liberals are more focused on. For example, the three outlets on today's SCOTUS decision against labor unions are USAToday, LA Times, and NBC, with a Huff Post opinion piece right under that list. One can assume that's just an artifact of Goog
No problem... (Score:2)
That explains it... (Score:2)
Re:This news piece has been greatly exagerated (Score:5, Insightful)
Bullshit. How do you know that you don't know anyone that was affected by it? Do you know which week in 2012 the experiment was conducted? Do you know which of the ~billion FB accounts were the 700k experimented upon? I find it pretty shocking that so many people are having difficulty understanding the difference between A/B testing and intentional emotional manipulation where a significant negative (or positive) result was the data point the study strove to measure.
I can quite imagine that a significant number of offline lives were impacted by this experiment. People exposed to negative content presumably don't limit their negative reactions to behavior only in the venue where they were exposed to the negative content.
Re:This news piece has been greatly exagerated (Score:4, Interesting)
I find it pretty shocking that so many people are having difficulty understanding the difference between A/B testing and intentional emotional manipulation where a significant negative (or positive) result was the data point the study strove to measure.
Creating an emotional response is part of marketing, and therefore of web design.
Of course you're not directly monitoring emotions as a data point during A/B tests. You measure, e.g., clicks, pages read, or time spent on the website. But every marketing guy worth his salt could tell you that you can increase all of those by "making the user feel at home".
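And the evaluation side is textbook statistics rather than emotion-reading. For example, a two-proportion z-test on the click-through rates of the two variants (a sketch; the numbers are invented):

    from math import sqrt

    def ctr_z_test(clicks_a, n_a, clicks_b, n_b):
        # Two-proportion z-test on the click-through rates of A and B.
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        p = (clicks_a + clicks_b) / (n_a + n_b)          # pooled rate
        se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))     # standard error
        return (p_b - p_a) / se

    # |z| > 1.96 means significant at p < 0.05 (two-sided).
    print(ctr_z_test(200, 10_000, 260, 10_000))          # z is about 2.8

The user's "feelings" only ever show up indirectly, as a shift in whatever metric the test was built around.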
Re: (Score:3)
So you don't want to create websites that people enjoy using?
That may explain the design of the average Linux user group website, but it would also explain why websites like Facebook, or even lolcats, that target emotions have more commercial success.
Re: (Score:2)
Web design, much like beauty, is in the eye of the beholder. But more importantly, web design is not about eliciting an emotional response from a user; it's about navigation: how easy it is for people to navigate your site. But if you want to get caught up in putting dancing bears on your web page so people can get a warm and fuzzy feeling, whatever.
Re: (Score:2)
Point taken.
I'll reduce that claim to "commercial web design". But that's still the majority of pages out there. They want to SELL. And if it takes those dancing bears, there is no way they won't use dancing bears.
Quick: what toilet paper brand has dancing bears as mascots?
And aren't they cute and funny and loveable.... See, it works.
Re: (Score:3)
All advertising is about manipulating emotional responses. And advertising has always involved experimentation, they just don't call it that normally.
Re:This news piece has been greatly exagerated (Score:4, Insightful)
1. Look up what wacky crimes were committed in January 2012.
2. Blame them on Facebook.
3. Sue.
4. Profit!
In January 2012 a bunch of kids formed the Islamic Caliphate of the Rusted Chevy on Cinder Blocks on my front lawn, despite their parents instructing them: "You best be staying away from Mr. Kid, he ain't right in the head. "
Obviously Facebook manipulation caused this.
Re:This news piece has been greatly exagerated (Score:5, Insightful)
Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.
Do you PERSONALLY know anyone who was affected by warrantless wire tapping? Neither do I.
As long as they never admit who it happened to, so that nobody can know whether it happened to them, then we're good? Look, there probably isn't anyone alive today (and certainly not on this website) who knew Little Albert, but that doesn't make the experiment that was done to him any less unethical.
Facebook's TOS can obtain consent, but it can never obtain informed consent.
Re: (Score:3)
Which is exactly what broadcasters do. They want to know if they should show the car crash story or the story with the kitten stuck in the tree, and they want to know which will grab the viewer's attention and increase ratings, etc. So they actually do the research, monitoring viewership over time as the channel's style changes, though often this is done by aggregating word of mouth from other stations, following the trade magazines, etc. Facebook was just being more efficient in this regard.
Not exaggerated at all (Score:4, Insightful)
Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.
Nobody knows who was affected or exactly how. That's part of the problem. They did it without knowledge or consent. They did not inform people of what they were doing, even after the fact. They did not have their design of experiment reviewed by an independent ethics board. They violated the (misplaced) trust their users had that their messages would be delivered as the users intended.
This isn't legal documents we're talking about here, anyway. I'm also pretty sure this is covered under Facebook's EULA/TOS you didn't read.
NOTHING in Facebook's TOS remotely qualifies as informed consent to be experimented upon. I don't even have to read it to know that. It's not THAT they did this experiment, it is HOW they did this experiment. It's not hard to put the experimental proposal in front of an ethics panel. It's not hard to get informed consent if that is deemed appropriate by the ethics panel. It is standard practice to do these things, for some very, very good reasons. Facebook couldn't be bothered.
Re: (Score:3)
I don't know anyone who was affected by the Tuskegee syphilis study, but that doesn't mean it was right or that we shouldn't be outraged.
Re: (Score:2)
Failbook has always proved and will always prove to be intrusive. Yet the sheep that use failbook continue to prove they are nothing more than stuipid little fucks that value nothing at all. Now with this "emotion experiment" the dumb asspie cracker Zuckerberg feels he is beyond any and all laws with his sheep still saying "fuck me in the ass harder Mark." The solution to this simple, shut failbook down. If you must keep in touch that is what email and *gasp* letters via snail fucking mail is for. Then there are also a new fangdangled method called a "website" that will allow for someone to put their shit up. Making a webpage is all too simple. If they can't make one then they are too fucking stupid to even exist let alone use a fucking computer so it is best to let the fucktarded sheeple that use failbook to fucking self destruct and perhaps earn themselves a fucking darwin award along the way.
I dare say I smell the distinct aroma of a Pulitzer from your florid loquaciousness.