Editor’s Note: Robert Klitzman is a professor of psychiatry and director of the Masters of Bioethics Program at Columbia University. He is author of the forthcoming book, “The Ethics Police?: The Struggle to Make Human Research Safe.” The opinions expressed in this commentary are solely those of the author.

Story highlights

Facebook conducted a study on nearly 700,000 users by manipulating their news feeds

Robert Klitzman: Facebook basically tried to alter people's mood without their knowledge

He says despite Facebook's user policy, this study violates accepted research ethics

Klitzman: We should try to avoid as much as possible becoming human guinea pigs

CNN  — 

Like many people, I use Facebook to keep up with friends about all kinds of things – deaths, births, the latest fads, jokes.

So I was disturbed to learn about an article, “Experimental evidence of massive-scale emotional contagion through social networks,” published last week in the Proceedings of the National Academy of Sciences (PNAS).

Facebook had subjected nearly 700,000 users to an experiment without their knowledge, manipulating these individuals’ news feeds by reducing positive or negative content and then examining the emotions expressed in their subsequent posts.


Facebook essentially sought to manipulate people’s mood. This is not a trivial undertaking. What if a depressed person became more depressed? Facebook says that the effect wasn’t large, but it was large enough for the authors to publish the study in a major science journal.

This experiment is scandalous and violates accepted research ethics.

In 1974, following revelations of ethical violations in the Tuskegee Syphilis study, Congress passed the National Research Act. At Tuskegee, researchers followed African-American men with syphilis for decades and did not tell the subjects when penicillin became available as an effective treatment. The researchers feared that the subjects, if informed, would take the drug and be cured, ending the experiment.

Public outcry led to federal regulations governing research on humans, requiring informed consent. These rules apply, by law, to all studies conducted using federal funds, but they have been extended by essentially all universities and pharmaceutical and biotech companies in this country to cover all research on humans, becoming the universally accepted standard.

According to these regulations, all research must respect the rights of individual research subjects, and scientific investigators must therefore explain to participants the purposes of the study, describe the procedures (and which of these are experimental) and “any reasonably foreseeable risks or discomforts.”

Facebook followed none of these mandates. The company has argued that the study was permissible because the website’s data use policy states, “we may use the information we receive about you…for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” and that “we may make friend suggestions, pick stories for your News Feed or suggest people to tag in photos.”

But while the company itself is not legally bound by these regulations, two of the study’s three authors are affiliated with universities – Cornell and the University of California, San Francisco – that publicly uphold this standard.

The National Research Act led to the establishment of local research ethics committees, known as Institutional Review Boards (or IRBs), which can waive the informed consent requirement in certain instances, provided, “whenever appropriate, the subjects will be provided with additional pertinent information after participation” – that is, researchers should “debrief” the participants afterwards.

Such a debriefing apparently did not occur here, but easily could have. Facebook said it reviewed the research internally, but there is no evidence that that review was by an IRB or met the standards of the federal regulations.

Moreover, the journal, PNAS, mandates that “all experiments have been conducted according to the principles expressed in the Declaration of Helsinki,” which also dictates that subjects be informed of the study’s “aims, methods…and the discomfort it may entail.”

The lead author, Adam Kramer, apologized on Facebook, writing, “my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.” But that statement falls far short. The problem is not only how the study was described, but how it was conducted.

Many researchers try to avoid having to obtain appropriate informed consent, since they worry that potential subjects, if asked, would refuse to participate. Pharmaceutical, insurance and Internet companies and others are increasingly studying us, acquiring massive amounts of data about us – about not only our Internet use, but our genomes and medical records. Many medical centers are building enormous biobanks. Countless websites now examine our behavior online. They ask us to scroll down and click “I accept,” assuming we’re unlikely to read the dense legalese and simply accept their terms.

In July 2011, President Obama released proposals to improve the current system of oversight of human research. The federal Office of Human Research Protections received public comments for a few months but appears to have put the matter on the back burner.

Social scientists have complained that the current regulations are onerous and that their research should be excused from IRB review.

The current system is overly bureaucratic and needs reform. But as this controversial Facebook experiment suggests, it should not be scrapped.

Good experiments benefit society. But in their zeal to conduct research, some social scientists overlook how their studies may impinge on people’s rights. As the amount of research on humans continues to grow, more violations will probably occur. We should try to avoid as much as possible becoming human guinea pigs.
