Jesse Singal at New York Magazine’s The Science of Us dives deep on the recent faked data scandal: now-debunked research showing that a conversation with a gay political canvasser could change a person’s opinion on same-sex marriage. The study was so highly regarded that Science magazine published it in December 2014.

However, it appears that one of the study’s authors, UCLA grad student Michael LaCour, took data from the 2012 Cooperative Campaign Analysis Project (CCAP), falsified it, and passed it off as his own original data, creating the illusion of a proven effect where there truly was none.

In New York Magazine, David Broockman, the Stanford researcher who brought the scandal to light, suggests an ongoing “post-publication peer review” for finding and exposing data errors that might have eluded a study’s initial peer reviewers.

Broockman had serious qualms about exposing the irregularities. First, in the early stages of his evaluation, before he had what he felt was conclusive proof of the deception, there was the damage an accusation might do to a potentially innocent researcher’s career. Broockman had no legitimate forum for questioning the data, or for enlisting help from other researchers, without casting a permanent shadow over the researcher in question.

Then there was the possibility of professional repercussions for himself, if the exposure were interpreted as professional jealousy rather than straightforward academic questioning and correction. Payback, Singal implied, would be unpleasant.

The alternative, however, was to let potentially bad research stand, distorting real-world decision-making and spending. It was an unenviable dilemma.

In the end, with what he felt was incontrovertible evidence of the deception in hand, Broockman took a professional risk and went public with the accusation. The Science article, which had spurred coverage in the New York Times and on This American Life, was quickly retracted.

To prevent this same conundrum from befalling others, Broockman proposes a “post-publication peer review.” This would give academics a dedicated forum, and a normalized cultural expectation, for reviewing the data in published studies, much as Facebook invites the public to report potential security weaknesses in its existing applications. The idea is to embed ongoing, continuous evaluation in the peer-review system, providing a legitimate outlet for other academics to question, and potentially debunk, previously published and peer-reviewed studies.

Not only would this relieve pressure on well-intentioned academics, it would also raise the quality of research in general. Those who might consider faking data, hoping to slip through the pre-publication peer review gates into the safe, unimpeachable territory of the “published, peer-reviewed study,” would be deterred by the prospect of ongoing, par-for-the-course review by their peers.

Create a system that discourages bad data from getting released in the first place? Yes, please.

______

New York Magazine’s The Science of Us: The Case of the Amazing Gay-Marriage Data: How a Graduate Student Reluctantly Uncovered a Huge Scientific Fraud

UC Berkeley: Irregularities in LaCour (2014)

Science: When contact changes minds: An experiment on transmission of support for gay equality (abstract only)
