BW Businessworld

Were You A Facebook Lab Rat?

Every now and again, Facebook ups and does something that gives its users a nasty surprise. Over the years, there have been so many issues relating to privacy and the world’s biggest social network’s treatment of its users that the subject seems to merit a Wiki of its own. You never know when the News Feed will change; you can’t predict when your name and photograph will appear somewhere you didn’t give permission for, and you can’t tell what data is being collected on you. In fact, Facebook is said to have fully cooperated with the US government in the collection of data on users, though more recently Mark Zuckerberg protested to President Obama about companies like his having to compromise users’ privacy.

At one time, those who were fed up couldn’t fully delete their accounts, because a mere de-activation didn’t amount to the data being erased. And in 2010, thousands of users declared May 31st Quit Facebook Day over privacy concerns.

When Facebook bought WhatsApp recently, the one big worry was what the network would do with users’ messages and whether users would have content forced on to them.

So you would think that Facebook would now be more careful than anyone else with users’ privacy. But instead, in 2012, it carried out an experiment, which has only now come to light, testing the impact of negative and positive content on users’ emotional states. The title of the study, “Experimental Evidence of Massive-scale Emotional Contagion Through Social Networks”, is enough to make users see red. What they did was alter the natural order of things, if you can call anything on Facebook that, and modify the amount of positive or negative content up to 700,000 users would see on their news feeds. Then they sat back and watched what would happen. Just as one person’s emotions can affect yours in direct interaction, they can do so via social networks – at least on Facebook.

I don’t know what they’ll do with the study, but one of the researchers involved, Adam Kramer, defended it in a post and said that though he was sorry it turned out this way, the idea had only been to understand Facebook’s users’ emotions — because Facebook cares, after all. We don’t know if this is the only such study going on at Facebook, and we don’t know how the data is used, but Facebook isn’t violating any laws because it’s covered by its user agreement and terms of use. The outrage over this study is all the greater because the almost 700,000 users were not asked – they were just used. All Facebook had to do, even if permission is implicit in its user agreement, was ask nicely. I for one am sure enough people would have opted in.

But the whole incident sets me thinking that Facebook can hardly be the only one. Messenger services pick up indicators of user mood – and god knows what they do with the data. Many companies ask for users’ agreement to collect data, but are hardly transparent about how that data is eventually used. Facebook users’ emotions were “manipulated”, but I’m thinking of the hundreds of serious ways we are already being manipulated outside of Facebook: advertising, television, social networks that present a picture of reality that isn’t real, and so much more. Subliminal messages creep through to us on an everyday basis. Sadly, Facebook is hardly the only one trying to tweak your emotions.