By now you’ve probably heard the news: Facebook has been messing with your mind.

Over the weekend, Facebook was in the headlines when it was discovered that the social network had performed a bit of a social experiment on its users. For a week in early 2012, Facebook’s algorithms manipulated the news feeds of roughly 700,000 users, showing them more positive or more negative posts, all in an effort to determine whether that skew would affect the positivity or negativity of those users’ own subsequent posts.

The results were published in a study called “Experimental evidence of massive-scale emotional contagion through social networks,” which is a fancy way of asking, “does seeing sad things make someone sad?” The answer, of course, is yes, it does. Since then, there’s been a pretty hefty public backlash against Facebook for manipulating users’ emotions. A post on TechCrunch says that it’s unethical, and that Facebook has breached the “unwritten social contract” between users and the service.

Facebook hasn’t ignored the outrage, though. The company researcher who oversaw and coauthored the study, Adam Kramer, offered an explanation as to the motivations of the experiment:

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

So is what Facebook did wrong? I’m not so sure. While I’m not a big fan of Facebook in general, I have to wonder whether the anger is misplaced in this instance.


For starters, posts were classified as negative or positive based on emotional words used by the posters themselves. So in terms of users’ privacy, it’s not like some dude in a lab coat was scrolling through news feeds to see what people were writing; it was all automated.


Secondly, as is often the case, users seem to forget that Facebook is a corporation that provides its services free of monetary cost. Users get the benefit of connecting with each other, organizing social events, discussing topics of the day, or lusting after pictures of Kate Upton all without paying a dime. In return, we are providing Facebook with lots and lots of data, with which they can do what they please.

The result of that agreement is often kind of gross, but to be surprised and outraged by it at this point is beyond naïve. The data we’re uploading to Facebook – be it where we work, where we went to school, what bands we like, and even pictures of our friends and relatives – is Facebook’s to do with as it pleases. It’s 2014, and this is old news, so to get angry about the company using that data for whatever it wants is pointless.

At the end of the day, if you don’t like the idea of a social network being able to use your data for whatever purpose it wants, then don’t use it. No one is forcing you to use Facebook – it’s all voluntary. I have friends who I still manage to see and communicate with outside of Facebook all the time. A lot of the time, the only interactions we have on the site are posting dumb pictures of baseball players making stupid faces, and then writing sarcastic comments about said pictures.


If you don’t like the way Facebook uses the information you’re willingly and voluntarily turning over to them, then your choice is clear: disconnect from Facebook and don’t look back.
