In 2012, Facebook Altered Content To Tweak Readers' Emotions
The Atlantic reports that two years ago, Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone, negative or positive, and observe the results. From the Atlantic article: For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
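The summary describes content being "analyzed as sadder than average" and sorted by tone. As a rough illustration of how that kind of word-based tone analysis can work, here is a minimal sketch; the word lists, scoring rule, and function names are invented for this example and are not the study's actual method or tooling.

```python
# Illustrative sketch only: a naive word-count approach to scoring a post's
# tone. Real systems use much larger curated word lists; these sets and the
# scoring rule are made up for the example.

POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "terrible", "hate", "awful", "lonely"}

def tone_score(post: str) -> int:
    """Positive word count minus negative word count; >0 reads upbeat, <0 reads sad."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "feeling sad and lonely tonight",
    "what a wonderful happy day",
]

# A feed could then be skewed by ranking the most positive posts first.
ranked = sorted(posts, key=tone_score, reverse=True)
```

Skewing a feed toward one emotional tone, under this sketch, is nothing more than changing the sort key.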
At least they showed their work.
consent (Score:5, Interesting)
There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...
Facebook encourages posing. (Score:5, Interesting)
I see it in myself, on the rare occasions that I actually post (roughly 5-10 times a year), and I see it with others whenever I go online to browse a little in the posts of the people I'm connected with ... called "Friends" (fingerquotes!) on FB:
Facebook and other "social networks" encourage posing. No two ways about it.
If you get all worked up and batter your self-esteem just because somebody posted himself in cool poses or at some event that you "missed out" on, that's on you, not Facebook. I get this a lot, since I'm only on FB for my tango dancing connections, a pastime where posing sometimes actually is part of the game. Actually knowing the person behind a neat facade on FB does put things into perspective.
Bottom line:
People shouldn't get more attached to these things than is good for them. If this neat little stunt by FB shows them that, then all the better.
My 2 cents.
Filter bubble (Score:5, Interesting)
What actually disturbs me more is: why would they do this? The answer is simple: they want to determine the most effective non-obvious way of creating filter bubbles that make the user feel good and stay longer.
It is, so to speak, a "second-order filter bubble", i.e. the use of a positive feedback mechanism.
It's called the Common Rule (Score:5, Interesting)
It's called the Common Rule [hhs.gov], although it generally only applies to federally funded research. There is some evidence [cornell.edu] that this study was in part federally funded. I think there are serious questions about whether a click-through agreement meets the standards of informed consent.
Although the study was approved by an institutional review board, I'm surprised, and the comment from the Princeton editor makes me wonder how well they understood the research design (or how clearly it was explained to them). This would never have gotten past my IRB.
Re:consent (Score:2, Interesting)
There are laws governing obtaining informed consent from humans before performing psychological experiments on them. I doubt that a EULA can override them. This should be interesting...
There is no such law. In any case, this is the basis for the entire news business. Why do they report murders but not acts of charity (unless a celebrity is involved)? It is all about getting you to watch and see ads. Facebook is doing nothing that TV news hasn't been doing for years.
Re:consent (Score:3, Interesting)
That said, I do find this whole thing to be a little shady...but I'm not sure it's a particularly rational reaction, given that I rarely care about A/B testing when it's being used to shamelessly make money off of me...
The father of propaganda would be proud (Score:4, Interesting)
In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons...who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind.
Re:consent (Score:5, Interesting)
There are laws against assault, bullying, and so on. The positive spin is innocuous, but the negative spin is not.
With 700,000 potential victims, the numbers are against them: when your sample size is that large, outliers are the rule, not the exception.
The risk of copycat suicide for example should have been obvious to those conducting this study.
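The commenter's sample-size point can be made concrete with a back-of-envelope calculation: even a very rare adverse reaction becomes near-certain to occur somewhere in a pool of 700,000 people. The per-user probability below is a made-up illustrative figure, not an estimate from the study.

```python
# Probability that at least one user out of n experiences a rare adverse
# reaction, assuming independent users and an assumed per-user probability p.

n = 700_000   # users in the experiment (from the article)
p = 1e-5      # invented per-user chance of a severe reaction, for illustration

p_at_least_one = 1 - (1 - p) ** n   # complement of "nobody is affected"
```

With these assumed numbers, the chance that at least one user is seriously affected comes out above 99.9%, which is the sense in which "outliers are the rule" at this scale.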
Future Facebook experiments! (Score:5, Interesting)
Hey, Facebook! Can you help me with some experiments of my own? I'd like to see the outcome if...
1. Right before an election, all posts favoring candidate X or the political views of party X were promoted to the top of everyone's feed and given 20 extra fake "Likes", while posts favoring the opposition are demoted and de-liked.
2. Phrases in posts favoring candidate X or the political views of party X are subtly "edited" when they appear in everyone else's news feeds to be more positive (e.g., "like" to "love", "good" to "great"), while phrases in posts favoring the opposition are given the reverse treatment and sprinkled with misspellings.
3. FB users with a tendency of opposition to party X have random fake posts/comments from them appear in others' feeds only, in which they insult their friends' baby pictures, make tasteless jokes, and vaguely threaten supporters of party X to "unfriend me if ur so lame u can't take the TRUTH, lol".