Consent of Human Subjects: Subjects not asked for permission first.
Findings: Extremely small effects.
Research methodology: Poor algorithms used, questionable findings.
Key finding: A reduction in negative content in a person’s Facebook news feed increased positive content in that person’s own posts by about 1/15 of one percent!
We address three questions:
1. Why did some of the checks and balances possibly fail?
2. Should we worry about the study’s findings?
3. What benefits do Facebook users get out of this study?
Non-techie description of study: News feed: ‘Emotional contagion’ sweeps Facebook
1. Some checks and balances failed
Following the spirit as well as the letter of the law is the key to successful compliance. In turn, any governance depends upon the participants doing their job thoroughly and carefully.
In this case, the academics thought this was an important subject that could be nicely studied with Facebook users. They may not have considered how much it might upset users and the media.
Cornell University has its own procedure in place for getting approval for research with human subjects. As the image below illustrates, the researcher is expected to reflect on the project and, if in doubt, ask for help.
The university points out that it did not review the study. Specifically, it did not check whether it met university guidelines for doing research with human subjects. The reasons given were that its staff:
– did not collect the data, and
– did not do the analysis.
Cornell University attempts to minimize its role in this tussle, but its statement shows it is simply doing damage control:
Media statement on Cornell University’s role in Facebook ‘emotional contagion’ research
US compliance: Federal Policy for the Protection of Human Subjects (‘Common Rule’)
Also interesting: Parsing the Facebook paper’s authorship and review
What about Facebook? By consenting to Facebook’s Data Use Policy — which, if you have a Facebook account, you did — you gave Facebook permission to use you as a test subject if necessary. Unfortunately, Facebook did not include the ‘research’ portion of this policy until May 2012.
Hence, when the study was conducted, users had not yet consented to this new policy. The policy states that the information may be used:
“…for internal operations, including troubleshooting, data analysis, testing, research and service improvement…“
Even so, making users submit to a catch-all list that does not provide a clear, specific description of a study fails the consent standard. The above illustrates that Facebook does not feel the need to ask your permission before carrying out its tests.
Adam Kramer – the Facebook employee AND first author of the paper – released a statement on his Facebook page (June 29, 2014) indicating that internal review practices were in place at Facebook. What they are and how they were applied to this study remain a mystery.
The paper was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS) and handled by the subject area editor Susan T. Fiske, a psychologist.
Ms Fiske had some concerns, but as the image above and the one below both illustrate, it does not appear that the researchers submitted a form from their university. Nor did Adam D. I. Kramer submit anything verifying his claim that Facebook’s committee on Research with Human Subjects had reviewed the study.
I guess editor Susan T. Fiske thought the information provided by the authors was sufficient… Apparently, the authors told her things were okay. If you cannot trust a fellow academic – whom can you trust? Insert sarcasm here.
2. No need to worry about the study’s findings
Previous experiments show that emotional contagion happens in real-world situations. For instance, interacting with a happy person is infectiously pleasant. Of course, crossing swords with a grump can launch an epidemic of grumpiness. Posting negatively worded stuff on Facebook does not make you many friends (see Forest, A. L., & Wood, J. V. (2012). When social networking is not working: Individuals with low self-esteem recognize but do not reap the benefits of self-disclosure on Facebook. Psychological Science, 23(3), 295–302. doi: 10.1177/0956797611429709. Retrieved May 16, 2012 from http://pss.sagepub.com/content/early/2012/02/07/0956797611429709.abstract).
This is the second study I have come across that tries to assess ‘contagions’ focusing on online exposure to mood-laden text. The current Facebook study revealed that reducing negative posts led to an increase in positive posts. Increasing negative posts in a subject’s news feed led to a decrease in positive posts by that person.
How big is the effect, you might ask? Kramer et al. (2014) found a 0.07 percent (i.e., about 1/15 of one percent) decrease in negative words in people’s status updates when the number of negative posts in their Facebook news feed decreased.
At that rate, you would have to write roughly 1,400 words before producing a single negative word more or less, so this effect is VERY small. The study relied on the Linguistic Inquiry and Word Count (LIWC) text analysis software, which in turn relies on a dictionary of words classified as “sad”, “angry” or “happy”. It counts their use and then arrives at an estimate of the emotions expressed in a given text. But does it work? The authors explain the use of this tool here:
“Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) word counting system…”
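In code, the quoted rule boils down to a simple dictionary lookup. Here is a minimal sketch in Python, using tiny made-up word lists purely for illustration (the real LIWC2007 dictionary contains thousands of categorized entries):

```python
# Minimal sketch of LIWC-style post classification. The word lists
# below are illustrative assumptions, not the actual LIWC2007 dictionary.
POSITIVE = {"happy", "great", "love", "nice"}
NEGATIVE = {"sad", "angry", "awful", "hate"}

def classify(post: str) -> set:
    """Label a post 'positive' and/or 'negative' if it contains at
    least one word from the respective list, per the quoted rule."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    labels = set()
    if words & POSITIVE:
        labels.add("positive")
    if words & NEGATIVE:
        labels.add("negative")
    return labels

print(classify("What a great day, I love it!"))  # {'positive'}
print(classify("Ich bin sehr traurig heute."))   # set() – a German post scores nothing
```

Note how a German status update ("I am very sad today") scores neither positive nor negative: an English word list simply has no entry for it.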
The study states: “People who viewed Facebook in English were qualified for selection into the experiment.” I for one view Facebook in English but often post in German as well. For my case, and for others in a similar position (e.g., viewing in English but also posting in Spanish), the program would fail completely: an English word list cannot score German or Spanish posts.
To make my point, I entered the text below into the LIWC Light tool. The screenshot shows the result I got. Ignore the two right-hand columns, which give the average scores for previously analyzed texts.
Even if we take English sentences, the algorithm’s results do not instill trust in these findings. Such attempts to map natural language onto mathematical criteria are fine, but they fail to detect sarcasm. For instance:
This one has zero negative words according to LIWC’s count, thereby demonstrating that the program cannot handle this properly.
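This failure mode is easy to reproduce. The sketch below uses a hypothetical sarcastic status update and tiny made-up word lists: every emotion-bearing word in the post is “positive”, so a per-word counter scores the post as positive even though any human reader senses the opposite.

```python
# Why bag-of-words counting misreads sarcasm. Word lists and the
# example post are illustrative assumptions, not LIWC2007 entries.
POSITIVE = {"great", "love", "wonderful", "perfect"}
NEGATIVE = {"sad", "angry", "awful", "terrible"}

post = "Oh great, I just love waiting two hours for customer service. Wonderful."
words = [w.strip(".,!?").lower() for w in post.split()]

pos_hits = [w for w in words if w in POSITIVE]
neg_hits = [w for w in words if w in NEGATIVE]

print(pos_hits)  # ['great', 'love', 'wonderful']
print(neg_hits)  # [] – zero negative words, so the post counts as positive
```

Three positive hits, zero negative hits: the counter confidently labels a complaint as a happy post.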
3. Benefits for Facebook users? What benefits?
Facebook has tested before and learned that when people see more text status updates in their news feed, they write more status updates themselves. Nevertheless, if the validity of the data is questionable because of the tool used, what else, if anything, can we learn from these findings?
Given the issues mentioned above, should I trust the results? Not if you watch this 44-second video of Ronald Reagan with Mikhail Gorbachev:
So we have to wait for another study, one that will hopefully give us more insight into how and why positively worded content does (or does not) positively influence your peers on Facebook.
I am fairly sure that the study’s results will be replicated with larger effects (assuming the language coding improves) before long.
Interesting read: In defense of Facebook
Finally, everybody, including family and lovers, is trying to understand your behaviour in the hope of getting you to do something. You decide whether this is manipulation or not. Your mother wants you to eat more vegetables and fruit, your boss wants you to work longer today without paying overtime, and so forth.
Facebook already ranks how you interact with your news feed using an algorithm called EdgeRank. It measures things such as how frequently the news feed’s owner interacts with a post’s author and the quality of that interaction; a comment, for instance, is more valuable than a Like.
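Facebook has never published the exact formula, but the commonly cited simplification multiplies, for every interaction (“edge”), an affinity score, an interaction weight and a time decay, then sums over all edges attached to a post. Here is a hypothetical sketch of that idea; the weights and the half-life are assumed values for illustration, not Facebook’s actual numbers:

```python
# Heavily simplified EdgeRank-style scoring. Real Facebook ranking is
# proprietary and far more complex; all constants here are assumptions.
WEIGHTS = {"comment": 4.0, "share": 3.0, "like": 1.0}

def edge_score(affinity: float, interaction: str, age_hours: float,
               half_life: float = 24.0) -> float:
    """affinity (viewer-author closeness, 0..1) x weight x time decay."""
    decay = 0.5 ** (age_hours / half_life)  # exponential decay of old edges
    return affinity * WEIGHTS[interaction] * decay

def post_score(edges) -> float:
    """Sum the scores of all (affinity, interaction, age_hours) edges."""
    return sum(edge_score(a, i, t) for a, i, t in edges)

# A fresh comment from a close friend vs. a two-day-old Like
# from a distant acquaintance.
close_friend = post_score([(0.9, "comment", 1.0)])
acquaintance = post_score([(0.2, "like", 48.0)])
print(close_friend > acquaintance)  # True
```

With these assumed numbers, the fresh comment from a close friend easily outranks the stale Like, which is exactly the behaviour the ranking is said to aim for.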
You can tweak the settings to make posts appear in their “natural” order.
By the way, if you feel you have wasted too much time on Facebook, why not check? Just download your Facebook history. This is available under ‘General Settings’. If you want to delete your account, you have to fill out a brief form.
VERY interesting read – ABSTRACT: Experimental evidence of massive-scale emotional contagion through social networks
Full Paper: Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences of the United States of America (PNAS), 111(24), 8788–8790. doi: 10.1073/pnas.1320040111. Retrieved June 30, 2014 from http://www.pnas.org/content/111/24/8788.full
What do you think?
Facebook has made no secret that its news feed is a manipulated version of reality, and its proprietary EdgeRank algorithm helps. It selects the posts and links from your friends to display in your feed that testing has shown are the most likely to interest you. Moreover, it knows which ones will encourage you to respond and post yourself.
How often do data scientists look at data for marketing purposes? Do you ever hear about this? Most likely not. The best part is that Facebook published these data rather than keeping them under wraps. It is certainly nicer to know what purpose you were manipulated for than to be left in the dark. Kudos to Facebook for letting us know what it is trying to understand! Plus, if you always tell people beforehand that they are part of an experiment, it can influence your findings (sometimes called response bias or the Hawthorne effect).
Nevertheless, here we have been the product, or guinea pig, that Facebook tested in this experiment. And while this raises ethical issues, it is an advertising-funded social network. I find it far creepier to think about all the studies where we are the product a company has been testing but never learn about it (e.g., Edward Snowden raised a few similar issues…).
– What is your experience, do people feel differently and thus post differently, when they see positive or negative posts in their feeds?
– The Office of the Data Protection in Ireland has asked Facebook to clarify privacy matters, including consent for this study. Do you think this is an issue for regulators?
Have your say below.
Hooray – you read the whole post by author Urs E. Gattiker – aka DrKPI! Want to hang out more? Check out the news updates on Twitter, join our Social Media Monitoring discussion group on Xing, chat with us on Google+, and receive your fortnightly updates and behind the scenes scoops through our newsletter.
Urs’ latest book, Social Media Audits: Achieving deep impact without sacrificing the bottom line was published in April 2014 by Chandos Publishing / Elsevier – blog readers => grab your 25 percent discount with free shipping now. Extra Tidbit: Need to do an audit? Get Social Media Audit: Measure for Impact (Springer Science Publishers).