
Facebook's emotional manipulation raises ethical questions, say experts

Social media site tinkered with 689,000 news feeds; a similar academic study would have needed ethics review board approval

Facebook has a reputation for pushing the boundaries of social interaction. But some say that by subjecting hundreds of thousands of users to involuntary testing of their emotional wellbeing, the company may have bypassed ethical standards that usually protect people from being used in academic and governmental research without their consent. And although Facebook, as a private company, may have been entitled to do so, some are questioning both its decision and the apparent hole in regulation that allowed the study to go ahead unchecked.

For a week in January 2012, Facebook researchers altered the algorithm behind hundreds of thousands of users’ news feeds, measured their reactions to the filtered content and found what they called the first evidence of “massive-scale” online emotional contagion. When positive expressions were reduced, people produced fewer positive posts and more negative posts, according to a paper published in the journal Proceedings of the National Academy of Sciences.

The Cornell University researchers involved in the study filed an application with the school’s Institutional Review Board, in accordance with regulations put in place after past experiments that had adverse effects on people’s mental and physical wellbeing. The board screens studies for ethical compliance, including informed consent. But Cornell told Al Jazeera that because the university was not conducting the research, merely using the results, the board decided the study did not have to be referred to the Cornell Human Research Protection Program.

“Because the research was conducted independently by Facebook and Professor Hancock had access only to results — and not to any data at any time — Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required,” the university said in a statement.

Each institution that conducts research on human subjects has an institutional review board, made up of at least five members, that screens proposals. One of the ethical requirements it looks for is informed consent, according to a statement from the American Association of University Professors, which declined to comment for this story.

According to its user agreement, Facebook reserves the right to gather and analyze wide swaths of data. But Kate Crawford, visiting professor at the Massachusetts Institute of Technology’s Center for Civic Media, said the case exemplifies a lapse in ethics by the technology giant, which uses data in ways largely unknown to the public.

"It's completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments," Crawford told the Wall Street Journal. Ethics are not "a major part of the education of data scientists and it clearly needs to be," she added.

Patrick O’Donnell, a psychology professor and deputy head of the school of psychology at the University of Glasgow in Scotland, said that while human subjects often aren’t informed of the hypothesis of an experiment they’re participating in, they’d certainly need to be aware that they were part of an experiment.

O’Donnell said that even when partnering with private industry, researchers still “apply to an ethical committee for approval” and specify the details of that partnership in the research proposal.

“One of the key ethical guidelines is that subjects cannot be deceived unless there are extraordinary circumstances” for doing so, he added. “The position in our university and every U.K. university is that you’d have to go through our ethical committees at the university level,” he said.

Meanwhile, Facebook is facing a backlash over the study.

On Sunday, many took to social media to express their anger and demand an explanation of why the company tinkered with 689,003 users’ news feeds and emotions. One Twitter user said Facebook could even have caused someone to commit suicide by altering his or her news feed and, consequently, mood.

“I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible,” privacy activist Lauren Weinstein tweeted. A researcher at Ohio State University found that teenagers frequently use social media as a primary means to communicate suicidal thoughts to friends and family, according to a study analyzing MySpace content published in the journal Cyberpsychology, Behavior, and Social Networking.

In a public apology posted to his Facebook wall, Adam Kramer, the study’s lead researcher at Facebook, said he was “sorry” and regretted not having explained the purpose of the research, which he said was designed to examine whether criticism that Facebook causes feelings of jealousy or exclusion among users had merit.

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” he wrote.

Kramer said the backlash has contributed to the company’s decision to carefully review its internal research practices in the years since the study was conducted in 2012. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he added.

Facebook did not respond to questions from Al Jazeera on the specifics of its internal review practices.

The University of California, San Francisco — home to one of the study’s co-authors — also declined to comment, but said the staff member involved was still employed by Cornell at the time of the study.
