In 2012, Facebook Altered Content To Tweak Readers' Emotions
The Atlantic reports that two years ago, Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone, negative or positive, and observe the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
At least they showed their work.
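For readers wondering what "content analyzed as sadder than average" amounts to mechanically, here is a minimal sketch of word-list-based tone scoring and feed skewing, assuming a simple keyword approach; the word lists, function names, and omission rate below are made up for illustration and are not the study's actual dictionaries or parameters.

import random

# Illustrative word lists only; they are not the dictionaries the study used.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful", "lonely"}

def emotional_tone(post_text):
    """Classify a post as 'positive', 'negative', or 'neutral' by keyword counts."""
    words = [w.strip(".,!?") for w in post_text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def skew_feed(posts, suppress_tone, omission_prob=0.5):
    """Randomly omit a fraction of posts with the suppressed tone,
    shifting the emotional mix of what the user sees."""
    return [p for p in posts
            if emotional_tone(p) != suppress_tone or random.random() > omission_prob]

# Example: a "positivity-reduced" condition drops some of the happy posts.
feed = ["I love this wonderful day!", "Feeling sad and lonely.", "Lunch at noon."]
print(skew_feed(feed, suppress_tone="positive"))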
consent (Score:5, Interesting)
There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...
Re:consent (Score:5, Insightful)
[citation needed]. Almost every major website does A/B testing [wikipedia.org]. Is there a law against this? (That's not a rhetorical question. I actually would like to know.)
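(For context, here's a minimal sketch of how a site might deterministically assign users to A/B buckets; the experiment name and 50/50 split are made up for illustration and aren't any particular site's implementation.)

import hashlib

def ab_bucket(user_id, experiment="feed_tone_test", treatment_share=0.5):
    """Deterministically map a user ID to 'treatment' or 'control'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "treatment" if fraction < treatment_share else "control"

print(ab_bucket("user_12345"))  # the same user always lands in the same bucket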
Re: (Score:3, Insightful)
I don't think this is about whether Facebook had a legal right to do this, but more on that in a minute. It's more about whether it was ethical on their part. Regardless, I think it clearly was not ethical for the researchers to do this study without getting the approval of the users who took part in the study.
Getting back to the legal issue, every site or app has a Terms of Service agreement. Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes? If they don't, it seems to me that could be a legal issue.
Re: (Score:3, Interesting)
That said, I do find this whole thing to be a little shady...but I'm not sure it's a particularly rational reaction, given that I rarely care about A/B te
Re: (Score:2)
Re: (Score:3)
If they don't, it seems to me that could be a legal issue.
I doubt it. First, there is no "law" requiring informed consent for such an innocuous study, just ethical guidelines. Second, there is no law saying that entities can only do things positively asserted in the TOS. Companies do behavior research all the time. Grocery stores experiment with different product placement, different background music, different lighting. They are not expected to get consent for that. This is no different. I am feeling a distinct lack of outrage about this.
Re:consent (Score:5, Interesting)
There are laws against assault, bullying, and so on. The positive spin is innocuous, but the negative spin is not.
With 700,000 potential victims, the numbers are against them: when your sample size is that large, outliers are the rule, not the exception.
The risk of copycat suicide for example should have been obvious to those conducting this study.
Re: (Score:1)
Re: (Score:3, Insightful)
Re: (Score:2)
I was addressing whether or not there could be legal issues, so the subject of intent becomes very relevant.
The intent, for half the people affected, was to cause harm.
Re: (Score:2)
Re: (Score:2)
In conducting the experiment, they caused harm, among other things by inducing depression that persisted even after the experiment. Causing harm was part of the intent even if it was not the ultimate intent.
Your argument was: the end justifies the means.
Re: (Score:2)
Apparently the psychologist involved thought it was OK, because it is part of normal default policy: "Facebook apparently manipulates people's News Feeds all of the time." https://theconversation.com/sh... [theconversation.com]
SHIT THAT'S A REAL SHOCKER deserving of capitals.
Re: (Score:2)
Does FB's TOS say that you might be randomly placed in an A/B test used for academic research purposes?
Actually, it does.
Re: (Score:2, Interesting)
Re:consent (Score:4, Informative)
From a legal standpoint, for an activity to be considered "research", it must be "designed to develop or contribute to generalizable knowledge". http://www.virginia.edu/vpr/ir... [virginia.edu]
When a website uses A/B testing to improve its own internal operations, it's seeking to privately develop limited knowledge on its own operations, rather than general knowledge. This puts it outside the scope of US federal regulations on research, which have been narrowly crafted to avoid regulating commercial activities like these.
Given these criteria, Facebook was surely engaged in research here: publishing the results in PNAS is precisely an attempt to contribute to generalizable knowledge, not just to improve its own internal operations.
Re: (Score:2)
I really doubt Facebook conducted this experiment in order to "develop or contribute to generalizable knowledge" out of the kindness of their hearts. Far more likely, they seek to profit from it by generating more posts, more traffic, etc.
This is no different from retail stores conducting studies linking customer spending habits to item locations; they do such studies all the time.
Re:consent (Score:5, Informative)
Yes, there are laws against this. Anyone who lives in Canada and was part of the experiment but was not asked for informed consent may contact Health Canada/the federal Crown about it. It's illegal here. [hc-sc.gc.ca]
Re: (Score:1)
Re: (Score:2)
Just think of it as the new Facebook beta.
That's pretty damn vague, son. (Score:1, Insightful)
What exactly is considered a "psychological experiment"? Your definition seems very vague, and implies that any sort of software usability testing or change that involves offering different experiences to different users should be outlawed or very strictly controlled.
Take Mozilla Firefox as a recent example. Firefox 28 had a shitty, but at least partially usable user interface. Then Mozilla released Firefox 29, which brought in the Australis user interface, which is indisputably a pretty much unusable pile
Re: (Score:1)
Well, I dunno, but I think being published in PNAS probably points towards it being considered a psychological experiment and not software usability testing.
Re: (Score:3)
Most of this 'free' software has a "do you accept the terms and conditions" clause where you have to click "I agree" in order to install or run the software.
Now the enforceability of such agreements may be open to dispute, especially if they entailed some sacrifice on the user's part ("you agree to give up your firstborn child to us.."), but they usually would cover "we can change the user interface at any time, and don't complain if memory leaks cause your system to crash eventually."
Filter bubble (Score:5, Interesting)
What actually disturbs me more is: why should they do this? The answer is simple: They want to determine the most effective non-obvious way of creating filter bubbles to make the user feel well and stay longer.
It is, so to speak, a "second-order filter bubble", i.e. the use of a positive feedback mechanism.
Re: (Score:2)
They want to determine the most effective non-obvious way of creating filter bubbles to make the user feel well and stay longer.
Well, they want the user to stay longer. But making them feel well wouldn't actually do that. Depressed people use the internet more. If it makes you depressed, then bingo! Ad revenue!
Re: (Score:2)
Rather than whine about being a guinea pig you could just not play their game. Every hour spent on Facebook is an hour of your life lost to more productive or enriching pursuits.
It's called the Common Rule (Score:5, Interesting)
It's called the Common Rule [hhs.gov], although it generally only applies to federally funded research. There is some evidence [cornell.edu] that this study was in part federally funded. I think there are serious questions about whether a click-through agreement meets the standards of informed consent.
Although the study was approved by an institutional review board, I'm surprised that it was, and the comment from the Princeton editor makes me wonder how well they understood the research design (or how clearly it was explained to them). This would never have gotten past my IRB.
Re: (Score:3)
What specifically does the Data Use Policy say about this? The bit I saw quoted was that users agreed to Facebook's "internal operations", with research being an example of those. Peer-reviewed publication in a journal is clearly not an internal operation.
Re: (Score:2)
That's not informed consent as it would be deemed by any research institution or court of law. Informed consent requires a discussion with the subject on the nature of the research, its purpose, the manner in which data will be collected and used, and an explicit agreement from the user. What Facebook thinks it has is implied consent - which they frankly don't have either.
This study is just plain unlawful.
Re: (Score:2, Interesting)
There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...
There is no such law. In any case, this is the basis for the entire news business. Why do they report murders but not acts of charity (unless a celebrity is involved)? It is all about getting you to watch and see ads. Facebook is doing nothing that TV news hasn't been doing for years.
Re: (Score:2)
There is no such law. In any case, this is the basis for the entire news business.
While it may not be an actual law, there are strict rules [hhs.gov] about this for any study, like this one, that receives US federal funding.
Re: (Score:3)
There are laws governing obtaining informed consent from humans before performing psychological experiments on them.
That only applies to federally funded research (which means almost all colleges and universities). Attempting to apply this to the private sector would raise serious First Amendment questions. What one person calls "psychological experiments", another might call "protected free speech".
Re: (Score:2)
That only applies to federally funded research (which means almost all colleges and universities). Attempting to apply this to the private sector would raise serious First Amendment questions. What one person calls "psychological experiments", another might call "protected free speech".
This study appears to have taken federal funds...
Re: (Score:3)
The ethics board must be a bunch of chimps. It's not Little Albert, but then again, it's on such a massive scale, with 700k users. What if someone committed suicide because of this manipulation?
Re: (Score:2, Insightful)
"There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting..."
They can do whatever they want, it's their site. They decide what to show you, whether to show you anything at all, and when.
And they have the right to sell that information about you at any price they choose.
It's their business.
You are the product, the psychologists are the customer.
Re:consent (Score:4, Insightful)
They can do whatever they want, it's their site.
Did you think about that before you wrote it? If not, take a second and think about it.
There are many, many, many things they cannot do with their site.
Re: (Score:1)
They can do whatever they want, it's their site.
Did you think about that before you wrote it? If not, take a second and think about it.
There are many, many, many things they cannot do with their site.
Within technical limitations, they can do anything they want with their site. However, some things they could do may have legal or financial consequences.
Re: (Score:3)
That's the first thing that popped into my mind. After having spent many hours over the past week helping my daughter do paperwork so that she could submit her extremely benign science fair project to the county science fair's institutional review board, I'm wondering how FB can get away with this. I guess they can get away with it because no one will call them out on it, unless some victims file a lawsuit.
That's the modern world -- a 15 year old kid doing something demonstrably harmless has to do hou
Re: (Score:2)
It's also not just a legal matter. Performing experiments on humans without their consent is immoral.
Re: (Score:2)
It's also not just a legal matter. Performing experiments on humans without their consent is immoral.
So is it immoral when Walmart puts Pop-Tarts on the endcap or different-colored bubblegum in the impulse aisle? They are trying to see if they can manipulate you into buying more product. Stores even experiment with different colors. Some stores want you to stay longer, so they use happy colors. Other stores want you to leave quickly to make room for the next customer, so they use sad colors. They are altering the moods of their customers too, and again without consent. I don't know what the threshold should be
Re: (Score:2)
Why I don't have a Facebook account (Score:5, Insightful)
This sort of thing is exactly why I have never signed up for an account. The lack of a moral compass at this company is profound.
Re: Why I don't have a Facebook account (Score:1)
Your positive comment about Facebook may have been due to psychological coercion. So, how may we take it as a true sentiment from you?
Re: (Score:2)
"It's also quite useful when applying for jobs, because nothing says "social outcast" like not having a Facebook account."
Socializing without Facebook (Score:2)
Facebook is profoundly useful though, as a messaging service that everyone uses and to keep abreast of things happening in friends' lives in a central, easy-to-access location. It's also quite useful when applying for jobs, because nothing says "social outcast" like not having a Facebook account.
Facebook is profoundly useful though, as a messaging service that everyone uses
I assure you that not "everyone" uses Facebook to communicate, including the majority of my social circle. Everyone I would actually communicate with via Facebook I can reach via some combination of email, phone, text messaging, instant messaging, US mail, fax, video conference, etc. Not to mention actually meeting them in person. If you really need Facebook to stay in touch, then you really aren't that close to begin with.
...nothing says "social outcast" like not having a Facebook account.
If you think Facebook is required to be in with the "cool" crowd then you need to ser
Re: (Score:2)
Re: (Score:2)
Not being in touch with what your friends are saying makes you more likely to be manipulated by mainstream media, for lack of a comparably sizeable, technically augmented alternative.
... that is, unless you ignore both...
Re: (Score:2)
Re: (Score:2)
Re:Why I don't have a Facebook account (Score:5, Insightful)
it seems it's pretty much the same for every other large US company too.
Re: (Score:2)
It's been easy to think of people who got involved with Facebook as lemmings; apparently guinea pig was the more apt mammal to choose.
Ethical Responsibility (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
to protect human subjects
Oh do stop being so precious. It's no different from an individual posting a sad or depressed piece, themselves. Should they then be sued, arrested or punished for the "emotional damage" they cause to anyone who reads it?
Re: (Score:2)
Facebook deliberately did it, to see the effects. Manipulating people is never ethically right.
And yet there are individuals who do exactly the same thing every day. I would suggest that there are also organisations that make a positive decision to post content to change the emotions of their readers: whether to make them happy (and possibly tie that happy feeling to the website's message - religious, political, cultural), or angry or apathetic.
Just like every advertisement we see is designed to manipulate our emotions, websites do it all the time for gain, so to have FB do the same is neither new
National Academy is for junk science (Score:2)
I am rather surprised that the National Academy published the results of a study which violated multiple ethical guidelines put in place to protect human subjects.
The only real point of being accepted to the National Academy is access to Proceedings of the National Academy of Sciences. They don't turn down anything from members.
For NA members, it's where you publish when nobody else will accept your paper.
Facebook encourages posing. (Score:5, Interesting)
I see it in myself, on the rare occasions that I actually post, which is roughly 5-10 times a year, and I see it with others whenever I go online to browse a little in the posts of the people I'm connected with ... called "Friends" (Fingerquotes!) on FB:
Facebook and other "social networks" encourage posing. No two ways about it.
If you get all worked up and batter your self-esteem just because somebody posted himself in cool poses or at some event that you "missed out" on ... I get this a lot, since I'm only on FB for my tango dancing connections, a pastime where posing sometimes actually is part of the game. Actually knowing the person behind a neat facade on FB does put things into perspective.
Bottom line:
People shouldn't get more attached to these things than is good for them. If this neat little stunt by FB shows them that, then all the better.
My 2 cents.
FAKEbook indeed (Score:2)
this is junk science... and there was no informed consent (consent buried in a EULA is not informed consent)
first and foremost, TFA's description is wrong.... TFA and the linked research did NOT list what words make a post "positive" or "negative"
we cannot check their work by examining what factors they chose to represent the experimental variables
2nd, you're absolutely right that what people post to facebook.com is often not an accurate reflection of their current mood or actions
my only caveat is that some facebook users really don
Re: (Score:2)
Not only do they facilitate and encourage posing, the whole reason for "web 2.0" and "social networks" is absolutely nothing but posing on a global scale.
As old Stanley Milgram would have said... (Score:5, Funny)
Outrage MadLibs (Score:2)
For every action .... (Score:2)
You feel happy ... very happy ... buy our stock (Score:4, Funny)
... our stock is very happy ... buy our stock ... you like using Facebook ... you are happy when you use FaceBook ... buy our stock ...
facebook censors (Score:2)
I have had my own posts removed from my newsfeed. That should be the bigger issue - I really doubt I've been singled out. (looks out window for black heli...)
Re: (Score:2)
Back when I used fb, what I liked most was how you'd have a page with something actually important that loaded in a snap, and fb would "fail" to thumbnail it with a nondescriptive error; but if you loaded some page full of bloatshit about something stupid, it would thumbnail quickly and show right up so that you could share the stupidity to your heart's content, even if it was old, no-longer-news.
FB has always shown people only a subset of your posts, even if they explicitly ask to see all of them. That's a big p
Re: (Score:2)
They eventually revealed the reason they only show your content to a subset of your followers:
So they could charge you to reach more of them. Seriously. You can pay to "promote" your posts, and all that does is increase your reach among the people who have explicitly indicated interest in your content.
Re: (Score:2)
Someone I know tried that once on a lark, to see if it works. It doesn't.
Re: (Score:2)
You misread my post. I have had my posts removed from my newsfeed. Not talking about friends, etc. If I view my newsfeed sorted by "most recent", fb has removed content. If I post about puppies, it shows up fine. So again, not only is fb denying my content to others, its mechanisms are denying it to me as well unless I view it directly on my home page.
The father of propaganda would be proud (Score:4, Interesting)
In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons...who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind.
Future Facebook experiments! (Score:5, Interesting)
Hey, Facebook! Can you help me with some experiments of my own? I'd like to see the outcome if...
1. Right before an election, all posts favoring candidate X or the political views of party X were promoted to the top of everyone's feed and given 20 extra fake "Likes", while posts favoring the opposition are demoted and de-liked.
2. Phrases in posts favoring candidate X or the political views of party X are subtly "edited" when the appear in everyone else's news feed to be more positive (e.g., "like" to "love", "good" to "great"), while phrases in posts favoring the opposition are given the reverse treatment and sprinkled with misspellings.
3. FB users with a tendency of opposition to party X have random fake posts/comments from them appear in other's feeds only, in which they insult their friends' baby pictures, make tasteless jokes, and vaguely threaten supporters of party X to "unfriend me if ur so lame u can't take the TRUTH, lol".
Re: (Score:3)
Ok, guys, I swear that there were no misspellings when I typed "when they appear" in #2 above. IT HAS BEGUN.
THIS STORY MAKES ME ANGRY!!! (Score:2)
You're surprised that Facebook does this? (Score:1)
Facebook users gave up their privacy and allowed their personal data to be mined. Posts have been used against them by employers, criminals, government agencies, various companies, and Facebook itself. Facebook sells your data to advertisers and other organisations. Does this really come as a surprise to anyone?
What Facebook has shown is that they can easily manipulate their users in a predictable manner. In this case it was for a study, but is there anything stopping them from doing something like this as a service to adve
Army funded (Score:2)
Re: (Score:3, Informative)
Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.
Re: (Score:2)
Impressions bite Social Media (Score:1)
There is way too much discussion of Facebook's legal standing here; if you have ever seen a moot court, you know you can use legal reasoning and even the body of the law to argue either side. The law, like politics and economics, is built on competing priorities.
A more telling result is the impression the disclosure leaves, which is why the story has legs. It is a bit like what happened to Donald Sterling; something that seemed OK in one context got leaked into a different context where it appears total
Re: (Score:2)
both sides do it