The fact is, Facebook has no obligation to take care of you.
Facebook has no obligation to make your life easier. Facebook has no obligation to pay more attention to your well-being than its own revenue. Facebook has no obligation to look out for your welfare. Facebook has no obligation to be the good guys. They really don’t.
What, did we expect them to?
Well, maybe that’s not the right question.
The conclusion: yes, “emotional contagion” can be spread through social media sites, as demonstrated by an experiment run by Facebook’s Data Science team for one week in January of 2012. Users were split into two groups whose news feeds were manipulated: “one in which exposure to friends’ positive emotional content in their News Feed was reduced, and one in which exposure to negative emotional content was reduced.” In other words: half of the 689,003 unknowing participants received sadder news feeds; half of them received happier ones.
The effect? Well, says Facebook, “the effect sizes from the manipulations are small (as small as d=0.001).” But, “an effect size of d=0.001 is not negligible: in early 2013, this would have corresponded to hundreds of thousands of emotion expressions in status updates per day.”
And the reaction?
Some choice quotes from a compilation of reactions gathered by the Wall Street Journal: Facebook is “using us as lab rats”, its behavior is “completely unacceptable” and “unethical.” “Is it okay for Facebook to play mind games with us for science?” asks one blogger. “Emotional manipulation is emotional manipulation,” writes another. One writer even describes Facebook as a “vampire goblin.”
But as others point out, maybe the most shocking part of this entire debacle is that people are shocked. Facebook has never been trumpeted as a paragon of corporate ethics, after all—and besides, what exactly did they do? Fail to show you your entire news feed? Is showing you your entire news feed anywhere in the Terms of Service? And with the sheer amount of personal data that every user hands over to Facebook daily, is anyone actually surprised that Facebook chose to use it?
So what, then, is the issue?
Well, the short version is that there are a ton of issues, among them that a) the study was probably illegal, b) Facebook only wrote permission to conduct the study into its ToS four months after the study had concluded, c) the study may have included users under the age of 18, and so on. But these are revelations that all came out in the days after the outrage began. People are mad, but they aren’t mad because of these points.
The reason for the anger, I think, has nothing to do with the study at all.
Or it does—but only tangentially. Because here’s the main argument advanced by those who say we shouldn’t be shocked by this: you chose to use Facebook. You chose to opt into its services, knowing that it’s a corporation like any corporation, with no actual obligation to take care of you. You chose to leave yourself open.
If you don’t like it, you can leave.
So I stared at my Twitter feed, taking in the outrage and the fury, and I was angry that I might have been one of the users experimented on. At the time of the experiment I was hovering on the border of full-blown depression, and the wrong stimulus could well have done real damage to both my short-term and long-term mental health. And the thought did, indeed, pop into my head: can I protect myself from this happening again? I chose to use Facebook, back in 2007; can’t I just choose to stop using it?
Well, let’s run a little imaginary scenario: Hannah Deletes Her Facebook, July of 2014.
The first thing I lose is a source of updates on the news. The “Trending” bar, the various opinion pieces my friends post; these are gone. That’s more or less okay, though—I have a Twitter, I have a New York Times account, I’ll survive.
The second thing? A large part of my ability to participate in the Student Net Alliance, receive updates on my dorm, and get in contact with NYU people. All of these communities, of course, have Facebook “groups” that they use to talk to each other and to tell me important news. I can arrange with some of these groups to email me updates, but it’ll be inconvenient for them, and I’ll be cut off from the community.
The third is my ability to contact more or less everyone I knew in high school. I got a new phone for what I’m pretty sure was my 18th birthday (I’m checking my Facebook timeline to confirm this, but can’t find evidence) and didn’t end up getting most of my friends’ phone numbers before we all scattered off to college; besides, we mostly communicated through Facebook anyway. I’ve used Facebook to keep up with not only my best friends from high school, but also acquaintances and people I know I’ll want to contact again someday; it’s a social network, that’s what it’s for.
The fourth thing is more subtle: everyone knows that employers look up potential hires on Facebook (which is why you shouldn’t post pictures of you and your friends getting blitzed, folks, or, uh, at least not publicly). Not having a Facebook for them to check out isn’t as bad as having a Facebook and having really worrying photos on it, but it’s definitely not a great thing.
The fifth thing: every single site or service that requires you to log in through Facebook. Which is a fair number of them.
Do I need to go on? It’s fairly plain that the cost of deleting my Facebook far, far outweighs the benefits of not being experimented on, and the cost of deleting my Facebook would continue to outweigh the benefits if Facebook were to do far worse things—I can’t choose not to have a Facebook at this point. Not opting into Facebook simply isn’t an option.
And this, I think, is where the anger comes from.
It’s not the sudden jolt of realizing you were experimented on; it’s the sudden jolt of realizing that there isn’t a hell of a lot you can do about it. Sure, you can complain; sure, you can write angry letters to the people at Facebook; sure, you can call Facebook a “vampire goblin” all you like.
But you can’t actually take your business somewhere else.
I don’t think anyone expected Facebook to take care of their welfare. But they did expect to be able to say, at some point, “Screw this, I’m exercising my right to use the best product available, and your product has become inferior.” And they can’t do that.
Maybe the market will turn over in a few years, and Facebook will become obsolete. But right now, it’s a long, long way from that. And the consumer has lost the right to choose whether or not to opt into its services.
Now that’s some negative emotional content.