Facebook under fire over 'creepy' secret study (Update 2)

Facebook secretly manipulated the feelings of 700,000 users to understand "emotional contagion" in a study that prompted anger and forced the social network giant on the defensive.

For one week in 2012, Facebook tampered with the algorithm used to place posts into users' news feeds to study how this affected their mood, all without their explicit consent or knowledge.

The researchers wanted to see if the number of positive or negative words in messages the users read determined whether they then posted positive or negative content in their status updates.

The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.

News of the study spread, along with anger and disbelief, when the online magazine Slate and The Atlantic website wrote about it Saturday.

"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the study authors wrote.

"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

While other research has used metadata to study trends, this experiment appears to be unique because it manipulated the data to see if there was a reaction.

'Super disturbing'

The social network, which counts more than one billion active users, said in a statement that "none of the data used was associated with a specific person's Facebook account."

"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible," it added.

"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends or information from pages they follow. We carefully consider what research we do and have a strong internal review process."

In the paper, the researchers said the study "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook."

But that did nothing to stem the growing anger of Facebook users.

"#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT... Yeah, time to close FB acct!" one Twitter user said.

Other tweets used words like "super disturbing," "creepy" and "evil," as well as angry expletives, to describe the psychological experiment.

Susan Fiske, a Princeton University professor who edited the report for publication, said the researchers assured her the study had been approved ahead of time by an ethics review board.

"They approved the study as exempt, because it is essentially a pre-existing dataset, part of FB's ongoing research into filtering users' news feeds for what they will find most interesting," she told AFP in an email.

"Many ethical issues are open to debate, and this one seems to have struck a nerve."

Katherine Sledge Moore, a psychology professor at Elmhurst College, said the study was fairly standard overall, especially for so-called "deception studies," in which participants are given one purpose for the research when they provide initial consent and told later what the study is really about.

In this case, however, the study's subjects did not even know they were in it.

"Based on what Facebook does with their newsfeed all of the time and based on what we've agreed to by joining Facebook, this study really isn't that out of the ordinary," Moore said.

"The results are not even that alarming or exciting."


More information: PNAS paper: www.pnas.org/content/111/24/8788.full.pdf

© 2014 AFP

Citation: Facebook under fire over 'creepy' secret study (Update 2) (2014, June 29) retrieved 27 June 2019 from https://phys.org/news/2014-06-facebook-users-emotions-secret.html


User comments

Jun 29, 2014
Big Brother is not only watching but participating.
Oh, well. I guess I won't even bother to wonder what next.

Jun 29, 2014
That their ethics review board approved this is concerning and suggests that board needs to take a basic refresher on ethical behaviour. What's next for their board? Approving drugs being added to the water supply for a study, on the weak grounds that we agreed the water company didn't have to supply "pure" water when we accepted chlorination, fluoridation and acceptable contaminant levels.

Not a great example because it simply wouldn't be legal but it isn't any less ethical.

Jun 29, 2014
It's certainly not like the research would benefit the mainstream media. You'd think no one at FB took Psychology 101. "Six months in the laboratory will save you a whole afternoon in the library." -- unk.

Jun 29, 2014
I personally TERROR don't think SCANDALS media outlets TERROR like BENGAHZI Fox News TERROR have any COVERUP ill TERROR effect SCANDAL on TERROR society BENGAHZI whatsoever TERROR.

Jun 29, 2014


Created by L. Ron Hubbard, who was discredited and made to look like a fool, his humanities 'movement' corrupted; the techniques taken over by the now-'CIA' (previously the OSS) and used in brainwashing and programming techniques. Ron was a fool and... the techniques don't work, right? Forget all that... and go back to sleep...

Then that old story of Facebook being a CIA operation in origin... that comes back to haunt them.

Right here, it shows its face and hand.

Jun 29, 2014
The wot-uh
in meeyorkuh
doynt tayste lyke
wot it ot-uh

Jun 30, 2014
People put their entire private lives online, friends, family, photos, even agree to targeted advertising without a second thought. But this is creepy? If there is a line somewhere, I sure couldn't say where it is.

Otherwise, I would say there is probable validity to the claims of the study, but realistically more samples are needed to draw any conclusive evidence.

Jun 30, 2014
It's certainly not like the research would benefit the mainstream media.

Governments will love this study. Manipulation of public opinion has always been right at the top of the pile of things they want to have.
Remember: They don't need to create full-blown pro-war hysteria to get citizens to cough up tax dollars or rally behind their cause - just shifting attitudes by a few points is enough when controversial issues are in the balance. And having this study is infinitely superior to reading about the effect in a book. This way they have QUANTITATIVE results. So now they can plan effects to the decimal point in advance.

And don't think for a second the US government can't order Facebook (or others) to change their algorithms (or bribe/force someone to change them... or hack them)... or that they would hesitate for a second to do so for some immediate populist gain.

Jun 30, 2014
It's common knowledge that sensory input affects mood. Like respected contributors using caps for effect. How very depressing.



Jun 30, 2014
Golly! If I were one of these Facebook users, I'd surely demand a refund.

Jun 30, 2014
Those agreements that we supposedly agree to when signing up with Facebook are not valid because the majority of us don't read them and there is no witness to testify that we did in fact read them. There is nothing wrong with studying people, but to manipulate them like rats is not right. I for one am sending a message to my friends list about it and closing my account.

Jun 30, 2014
TheKnowItAll: If you live in the US, it is pretty much legally binding unless it has unlawful content. The fact that you did or did not read them doesn't matter as long as you agreed. The most famous example is ProCD, Inc. v. Zeidenberg, if I'm not mistaken.

I would imagine most countries would handle it the same way.

Jun 30, 2014
It's common knowledge that sensory input affects mood.

That's why I put 'quantitative' in caps - because this is the crucial difference (which you missed) to just having the "common knowledge" of something.

Knowing that the stock market will rise when people buy and will drop when people sell is common knowledge. Knowing how much it will rise (i.e. quantitative knowledge) is the difference between making and losing billions when investing. With this study people now have a quantitative baseline for massive opinion manipulation.

Jun 30, 2014
I agree with antialias about quantitative results (somewhat proof), which are a must before undertaking anything serious. Also I don't think that it became public knowledge by mistake. They probably figured: let's only abuse 700,000 people for now as a testbed, let them know, and if the damage incurred is minimal, let's go full gear.

Jul 01, 2014
Once considered shouting in frustration by angry losers, now a style to highlight a crucial difference. Interesting evolution - did you invent that cognitive change, AA? You have quantitative data (5-1) as to the effect of using caps. You're welcome.

Jul 01, 2014
did you invent that cognitive change AA?

No. I have been using this style here for a long time. In the 'olden days' of computers _this_ (or *this*) used to be the style for emphasis, but that has died out.

It is sometimes important to point to nuances that otherwise will get missed (as in this case I knew it would - and you missed it even when emphasis was used. There's your quantitative data for that).

For shouting there exists already a symbol. It is fittingly called an 'exclamation mark'.

Jul 01, 2014
Don't let bluehigh troll you over capitalizing a word. Geesh.

He's issued death threats against users on this site before. Don't even respond to that scum.

Jul 01, 2014
Such a sensitive soul you are, JohnGee. I am surprised that a relative newbie like yourself can remember such wonderful archived conversations going back beyond your years of membership. Or are you just so bent that stalking becomes you? Sure, why not, let's go play in the gutter. I'll stand up again... Let's cull out of the human race idiots like you, just to ensure our species' viable future. That rewinds us back a few years.

My mention of the use of caps related to how readers can be affected by content. It's a simple example of how the research as outlined in the article can have serious implications. AA may have partially understood (as I mentioned - AA is a respected contributor), but you, Mr Gee, are clueless.

AA can stand up for himself and doesn't need you, JohnGee guttersnipe, to assist him.

Jul 01, 2014
Most people don't know what is going on behind the curtains, and I suggest you don't look. Knowing the truth doesn't feel good and doesn't make you blend in with people well. Watch the movie The Matrix; it is here full blast.

Jul 06, 2014
Yep, ignorance is bliss, evropej.
Think again!!

Jul 07, 2014
I wrote a blog article on this issue that provides more background on the research, researchers, and what I consider to be the heart of the controversy - "informed consent".

The bottom line is I believe Facebook violated ethical guidelines by not obtaining anything resembling legitimate informed consent. However, the actual "harm" caused by this study was pretty minimal. The bigger issue is what needs to be done to prevent future research from going further down this path. The Department of Defense's "Minerva Initiative" has listed "emotional contagion" as a priority research topic for FY14 and they are offering funding for more of this type of research. The article linked below has a lot more detail than I have room to post here.

