by Steve Adubato, PhD
The news broke last week that Facebook conducted a psychological experiment on approximately 700,000 of its users, manipulating their news feeds to see how they reacted to what appeared there. Simply put, the company secretly altered the news feeds of its test subjects by changing the number of positive and negative posts they saw. The goal? To examine how emotions spread on social media.
All of us who engage in social media know there is a degree of communication gamesmanship going on. We also accept that much of our information is used and shared in ways we often aren’t aware of; it is all part of the process.
But there is a big difference between sharing data and manipulating it. Selectively filtering what people see alters the communication process in a way that few, if any, Facebook users (myself included) would ever agree to.
To make matters worse, while Facebook’s policy now informs users, in the fine print of the user agreement, that their information may be used for "internal operations," including "research," the policy did not contain this language at the time of the study in January 2012. It was added to the "fine print" in May 2012, four months after the study.
But those are the operative words — "fine print." If Facebook had nothing to hide and was proud of conducting this psychological test, then why not communicate in a more upfront and candid fashion? Why not use the bold and powerful Facebook platform to announce what it was doing and why it was doing it two years ago?
I would have respected Facebook’s leadership more had it said to all of its users, in bold type (not fine print), the following: "We are conducting a research study to analyze how users’ moods are impacted by social media. Unless you opt out of this experiment, for one week your news feed may be manipulated at our discretion for research purposes."
If Facebook had communicated in this fashion, I would probably have opted out, as would many others. But at least Facebook would have been clear about its intentions and saved itself from this unnecessary embarrassment.
When a company claims a legal right to do something because of a contractual agreement its customers signed, even though those customers rarely read, much less understand, the "fine print" of that agreement, there is good reason to question the motives and intent involved.
It is reasonable to believe that Facebook had no intention of letting its users know what it was doing until the results came back from this so-called experiment. One wonders whether, when the idea was conjured up in the Facebook boardroom (or wherever such decisions are made), anyone asked: How will users react once they find out? Will this help or hurt Facebook’s reputation? Is this the smartest way to get this information?
I’m betting such a candid, strategic and important conversation never took place, for if it had, Facebook would likely not have found itself in such an embarrassing and awkward position. This was an entirely avoidable PR debacle by a very large and supposedly sophisticated organization. What exactly were they thinking?