Uri Simonsohn and Finding Fake Data
A quick update on Simonsohn’s “Just Post It” paper: Friday’s post focused mostly on the steps Simonsohn took to avoid making false accusations and linked to the newly available paper so people could check the details for themselves. I thought it was worth adding a little explanation for those curious about the details. Without going into great depth — interested readers can, of course, read the paper themselves — I think Simonsohn’s analysis of the Smeesters study that used a Willingness to Pay (WTP) measure is easy to understand. It illustrates why Smeesters is very unlikely to have merely dropped a few participants who didn’t follow directions (or even eliminated a handful of inconvenient data points), as he claims to have done, and it makes a strong case for Simonsohn’s call to post raw data. Note that this analysis is not representative of the other analyses Simonsohn conducted.
Everyone's been talking about psychologist Uri Simonsohn and his role in the downfall of two scientific fraudsters. When the story first broke, the methods Simonsohn used to spot the dodgy data were mysterious, which only added to the buzz. The paper revealing the approach is now online and it's a must-read. It's not often a statistics paper offers the train-wreck schadenfreude of watching two fraudsters' careers come to a well-deserved end. What's rather disturbing about the article, however, is that it doesn't really contain much that's new in principle: Simonsohn used statistics to spot data in published papers that was, in effect, 'too good to be true'.
Blackout: on Monday, the university released a version of the report in which Simonsohn's name and method were redacted. (Image credit: Erasmus University Rotterdam.) Erasmus University Rotterdam in the Netherlands has identified Uri Simonsohn, an associate professor at the Wharton School of the University of Pennsylvania, as the whistleblower in the case of psychologist Dirk Smeesters. The university today released the unredacted report from the investigative committee that looked into allegations of misconduct against Smeesters.
Uri Simonsohn, the researcher who flagged up questionable data in studies by social psychologist Dirk Smeesters, has revealed the name of a second social psychologist whose data he believes to be suspiciously perfect. That researcher is Lawrence Sanna, whose former employer, the University of Michigan in Ann Arbor, tells Simonsohn that he resigned his professorship there at the end of May. The reasons for Sanna's resignation are not known, but it followed questions from Simonsohn and a review by Sanna’s previous institution, the University of North Carolina at Chapel Hill (UNC). According to the editor of the Journal of Experimental Social Psychology, Sanna has also asked that three of his papers be retracted from the journal. In both Smeesters’ and Sanna’s work, odd statistical patterns in the data raised concerns with Simonsohn, at the University of Pennsylvania in Philadelphia. But the similarity between the cases ends there.
University of Michigan psychologist resigns following concerns by statistical sleuth Simonsohn (Nature). A second psychology researcher has resigned after statistical scrutiny of his papers by another psychologist revealed data that was too good to be true. Ed Yong, writing in Nature, reports that Lawrence Sanna, most recently of the University of Michigan, left his post at the end of May. That was several months after Uri Simonsohn, a University of Pennsylvania psychology researcher, presented Sanna, his co-authors, and Sanna’s former institution, the University of North Carolina at Chapel Hill, with evidence of “odd statistical patterns.”
Has Uri Simonsohn, a social psychologist at the Wharton School of the University of Pennsylvania, discovered a new way to detect scientific fraud just by subjecting the data in published papers to a novel type of statistical analysis? That's a question social psychologists and statisticians are asking after an investigative commission at Erasmus University Rotterdam in the Netherlands used his unpublished technique to probe the work of marketing researcher Dirk Smeesters — an inquiry that led to Smeesters's resignation and a request by the university to retract two of his papers. Smeesters continues to deny that he has committed scientific fraud, however, and at least one statistician who has looked into Simonsohn's method says the technique appears to have merit but was used in the wrong way.
Psychology was already under scrutiny following a series of high-profile controversies. Now it faces fresh questions over research practices that can sometimes produce eye-catching but irreproducible results. Last week, Erasmus University Rotterdam in the Netherlands said that social psychologist Dirk Smeesters had resigned after an investigation found that he had massaged data to produce positive outcomes in his research, such as the effect of colour on consumer behaviour [1,2].
Last week, I wrote about the case of Dirk Smeesters, a social psychologist who had resigned from Erasmus University Rotterdam after an investigation uncovered problems with the data in two of his papers. His case follows the scandal of Diederik Stapel, another psychologist from a Dutch university who was found guilty of research fraud last year. As I noted in my post, the Smeesters case is unique.
AMSTERDAM — The most startling thing about the latest scandal to hit social psychology isn’t the alleged violation of scientific ethics itself, scientists say, or the fact that it happened in the Netherlands, the home of fallen research star and serial fraudster Diederik Stapel, whose case shook the field to its core less than a year ago. Instead, what fascinates them most is how the new case, which led to the resignation of psychologist Dirk Smeesters of Erasmus University Rotterdam and the requested retraction of two of his papers by his school, came to light: through an unpublished statistical method to detect data fraud. The technique was developed by Uri Simonsohn, a social psychologist at the Wharton School of the University of Pennsylvania, who tells Science that he has also notified a U.S. university of a psychology paper his method flagged.
Back in November, I wrote about the Diederik Stapel affair that was rocking the psychological community. Now, it’s all happening again: recently, Uri Simonsohn found evidence suggesting that another Dutch social psychologist, Dirk Smeesters, has been tinkering with data to produce more desirable outcomes in his research. We’re still waiting on Simonsohn’s paper on the matter to be published, so I think it’s a little too soon to speculate on how he came across Smeesters’ work; some have already started to make melodramatic comments likening Simonsohn’s approach to medieval torture. Clearly, we need to be very cautious about using such approaches as a sledgehammer. We don’t want to end up in a situation where completely innocent researchers are wrongly caught out. But there are also some tough questions that need to be asked about how and why this sort of behaviour is happening.
Uri Simonsohn’s “secret” paper describing the analyses he used to detect fraud in the Dirk Smeesters and Larry Sanna cases has now been submitted for publication and is available on SSRN. It’s titled “Just Post It: The Lesson from Two Cases of Fabricated Data Detected by Statistics Alone.” Simonsohn explains the analyses he used to detect and confirm the fraud and calls on journals to make the publication of raw data their default policy.
University of Pennsylvania - The Wharton School, January 29, 2013. Abstract: I argue that requiring authors to post the raw data supporting their published results has, among many other benefits, that of making fraud much less likely to go undetected. I illustrate this point by describing two cases of fraud I identified exclusively through statistical analysis of reported means and standard deviations. Analyses of the raw data behind these cases provided invaluable confirmation of the initial suspicions, ruling out benign explanations (e.g., reporting errors, unusual distributions), identifying additional signs of fabrication, and also ruling out one of the suspected fraudster’s explanations for his anomalous results. 31 pages. Keywords: data transparency, fake data, science, judgment and decision making.
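The abstract's core idea — flagging reported condition means that cluster more tightly than sampling error allows — can be sketched with a small simulation. This is an illustrative toy in the spirit of that approach, not Simonsohn's actual procedure; the function name and every number below are invented for the example:

```python
import math
import random
import statistics

def similarity_pvalue(means, sd, n, sims=20_000, seed=0):
    """Estimate how often k independent condition means (cell size n,
    common population SD) would cluster at least as tightly as the
    reported means, assuming a shared true mean. A tiny value suggests
    the reported means are 'too similar' given sampling error."""
    rng = random.Random(seed)
    se = sd / math.sqrt(n)             # standard error of each cell mean
    observed = statistics.stdev(means)  # spread of the reported means
    hits = 0
    for _ in range(sims):
        # Simulate k cell means around a common true mean (its value
        # does not affect the spread, so 0.0 is used).
        sim = [rng.gauss(0.0, se) for _ in means]
        if statistics.stdev(sim) <= observed:
            hits += 1
    return hits / sims

# Means almost identical despite SD = 3.0 and only 15 per cell: suspicious.
p_flat = similarity_pvalue([10.10, 10.20, 10.15, 10.18], sd=3.0, n=15)
# Means spread out on the order of the standard error: unremarkable.
p_ok = similarity_pvalue([9.0, 10.5, 11.2, 10.0], sd=3.0, n=15)
```

Under these made-up numbers the first estimate comes out far smaller than the second. On its own such a test only raises a question rather than settling it, which is exactly why the paper argues that raw data are needed to confirm or rule out benign explanations.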
Full interview with Dutch reporter Maarten Keulemans of the newspaper de Volkskrant. Note: Maarten sent me a set of questions, which I answered in writing, followed by a phone conversation (July 4th, 2012). His edited interview was printed, in Dutch, in de Volkskrant.
A record-breaking year for retractions in 2011, a new record for retractions by one person — what’s going on? Ivan will be a guest for a live chat with Science magazine today at 3 p.m. Eastern to discuss fraud, ethics, and retractions. Join him, University of Virginia psychologist Brian Nosek, and Science reporter Martin Enserink, who most recently reported on the case of Dirk Smeesters. To participate in the chat, click here.
Greg Francis & Publication Bias Detection