Facebook Cofounder Mark Zuckerberg
WHO Facebook Menlo Park, Calif.
FIELD Social Sciences
ACHIEVEMENT Studies that shed new light on human behavior.
Few scientific studies have ever generated more outrage than one published in the journal Proceedings of the National Academy of Sciences. In it researchers revealed that they had altered the Facebook feeds of nearly 700,000 users. Some users were shown more "happy" posts, while others saw more negative content—all without their knowledge, of course. The researchers, working with Facebook, then tried to measure the readers' emotional responses.
After the study was published, some flipped out. Users pelted Facebook with angry comments, tweets, and posts. The journal itself issued an expression of concern, saying "the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out." One of the authors, Facebook engineer Adam D. I. Kramer, posted his own apology on his Facebook page: "Our goal was never to upset anyone," he wrote. One critic, writing in The New York Times, said that if the study had been federally funded, "such a complacent notion of informed consent would probably have been considered a crime." Yet almost nobody said what they should have, which was, "Cool."
Facebook is a vast petri dish of human connectedness. For researchers, it presents a writhing, roiling, and endlessly fascinating data set—an opportunity to listen in on hundreds of millions of conversations every day.
"I'm not sure there's a more valuable resource that's ever been created for social science research," says James Fowler, a professor at the University of California, San Diego, and coauthor of Connected. "There's a thousand years' worth of work to be done there, and I'm only going to get to do 20 of them." Fowler has teamed with Facebook on several projects, including a study on how social media affected voter turnout in the 2010 midterms. They showed get-out-the-vote messages to 61 million people and measured the outcome; it turned out those who saw pictures of close friends alongside the message were more likely to vote. "It worked out to a quarter-million more people voting," says Fowler, "and we didn't hear any blowback at all."
The more recent study was innocuous compared with such manipulation of the democratic process. It examined the trope that Facebook posts about people's perfect lives make others feel sad about their own existences. It found the opposite: happiness, it turns out, is slightly contagious. Another discovery: If you mess with people's kitten-video feeds, they get angry. But here's a news flash: If someone is trying to sell you something, chances are they're also trying to manipulate you in some way. All the time. Big companies do research on customers every day. It's why stores put sugary cereals at a level where the characters on the boxes can look kids in the eye. The difference is that Facebook published the findings, along with dozens of other papers it collaborates on annually. In his apology, Kramer made an interesting point: "The goal of all of our research at Facebook is to learn how to provide a better service." In other words, it's a product. You are its consumers. Caveat emptor.
We swim in a sea of electronic information. And while we should know better by now, we still delude ourselves that what we see on our screens somehow represents an objective, unfiltered reality. We think the Internet isn't pushing our buttons. The emotional-reaction study shattered that illusion, if only briefly. And it isn't only Facebook: Google has admitted to manipulating search results, tweaking its algorithm to try to bury sites it deemed spam and content farms. It could do the same for less honorable reasons. A few years ago I searched for "worst president ever," and the top result was then-president George W. Bush's official White House page. Amazon does it, too, most recently by hiding or removing the Buy button for titles published by Hachette Book Group, a publisher with which it is in a dispute.
These are just some of the examples we know of. The happiness study was a useful wake-up call: You're not just checking Facebook; it's checking you. But Facebook stands out for sharing what it learns in ways that could benefit all of us.