
The existential quest of psychological science for its soul


Real Data Are Messy - Neuroskeptic. Over at the sometimes i'm wrong blog, psychologist Michael Inzlicht tells A Tale of Two Papers. Inzlicht describes how, as associate editor at the Journal of Experimental Psychology: General, he rejected a certain manuscript. He did so despite the fact that the peer review reports had been very positive. The article reported 7 studies, all of which found nice, statistically significant evidence for the hypothesis in question.

Can Psychology Be an Empirical Science? - Neuroskeptic. In a provocative new paper, Norwegian psychologist Jan Smedslund argues that psychology "cannot be an empirical science". Smedslund is a veteran of the field; his first paper was published in 1953. He opens by saying that psychology is a science in crisis, both with respect to theoretical coherence and practical efficiency.

Social Priming: Money for Nothing? - Neuroskeptic. Can the thought of money make people more conservative? The idea that mere reminders of money can influence people's attitudes and behaviors is a major claim within the field of social priming – the study of how our behavior is unconsciously influenced by seemingly innocuous stimuli. However, social priming has lately been controversial, with many high-profile failures to replicate the reported effects. Now, psychologists Doug Rohrer, Hal Pashler, and Christine Harris have joined the skeptical fray in a paper soon to be published in the Journal of Experimental Psychology: General (JEPG) (preprint here). Rohrer et al. report zero evidence for money-priming effects across four large experiments. They conclude that "Although replication failures should be interpreted with caution, the sheer number of so many high-powered replication failures cast doubt on the money priming effects reported…"

Don’t call it a comeback. The Reproducibility Project, the giant study to re-run experiments reported in three top psychology journals, has just published its results, and it's either a disaster, a triumph, or both for psychology. You can't do better than the coverage in The Atlantic, not least as it's written by Ed Yong, the science journalist who has been key in reporting on, and occasionally appearing in, psychology's great replication debates. Two important things have come out of the Reproducibility Project. The first is that psychologist, project leader, and now experienced cat-herder Brian Nosek deserves some sort of medal, and his 270-odd collaborators should be given shoulder massages by grateful colleagues. It's been psychology's equivalent of the Large Hadron Collider, but without the need to dig up half of Switzerland. When broken down by subject area, studies in cognitive psychology were more likely to replicate than studies in social psychology.

Three Popular Psychology Studies That Didn't Hold Up.

Philip Zimbardo has a theory. "Boys risk becoming addicted to porn, video games and Ritalin," says psychologist Philip Zimbardo, which simply isn't true, because some weekends I read. Yes, Zimbardo has a theory which says that masculinity is being damaged by computer games, the internet, and pornography without an adequate plot line. A key solution: dancing.

Andrew Przybylski/Phil Zimbardo debate video game effects.

Terry Burnham: A Trick For Higher SAT scores? Unfortunately no. Wouldn't it be cool if there were a simple trick to score better on college entrance exams like the SAT and other tests? There is a reputable claim that such a trick exists. Unfortunately, the trick does not appear to be real. In the spring of 2012, I was reading Nobel Laureate Daniel Kahneman's book, Thinking, Fast and Slow. Professor Kahneman discussed an intriguing finding that people score higher on a test if the questions are hard to read. The particular test used in the study is the CRT, or cognitive reflection task, invented by Shane Frederick of Yale: "90% of the students who saw the CRT in normal font made at least one mistake in the test, but the proportion dropped to 35% when the font was barely legible." I thought this was so cool. Malcolm Gladwell also thought the result was cool.

A puzzle about the latest test ban (or 'don't ask, don't tell'). A large number of people have sent me articles on the "test ban" of statistical hypothesis tests and confidence intervals at a journal called Basic and Applied Social Psychology (BASP)[i]. Enough. One person suggested that, since it came so close to my recent satirical Task force post, I either had advance knowledge or some kind of ESP. Oh please, no ESP required. None of this is the slightest bit surprising, and I've seen it before; I simply didn't find it worth blogging about.

Surrogate Science. No scientific worker has a fixed level of significance at which from year to year, and in all circumstances, he rejects hypotheses; he rather gives his mind to each particular case in the light of his evidence and his ideas.

Rolf Zwaan: When Replicating Stapel is not an Exercise in Futility. Over 50 of Diederik Stapel's papers have been retracted because of fraud. This means that his "findings" have now ceased to exist in the literature. But what does this mean for his hypotheses?

Why We May Never Beat Stigma. When public figures want to display penitence for their bad choices—see under "Woods, Tiger" and "Gibson, Mel"—they go to rehab. Whether the problem is extramarital affairs, plagiarism or even racism, crying addiction has become an all-purpose excuse. This month saw the "Lying Dutchman"—a top social psychologist who was found to have published over 55 fraudulent academic papers, including one in the prestigious journal Science—release a memoir calling his data fakery an addiction. At the same time, another columnist went as far as to blame conflicts of interest in medical research on doctors' "addiction" to taking money from Big Pharma. I'm tempted to call the problem here an addiction to addiction, but that would make matters even worse. Labeling any bad behavior for which someone seeks absolution an addiction makes the term completely meaningless.

[17] No-way Interactions. This post shares a shocking and counterintuitive fact about studies looking at interactions where effects are predicted to get smaller (attenuated interactions). I needed a working example and went with Fritz Strack et al.'s (1988, .pdf) famous paper [933 Google cites], in which participants rated cartoons as funnier if they saw them while holding a pen with their lips (inhibiting smiles) vs. their teeth (facilitating them). The paper relies on a sensible and common tactic: show the effect in Study 1, then in Study 2 show that a moderator makes it go away or get smaller.

Sometimes i'm wrong: this is what p-hacking looks like. [image caption: keri russell plotting her next QRP] i am teaching a seminar called 'oh you like that finding do you?' Sweet dreams.

Words and sorcery. Back in 1971, Stanislav Andreski's Social Sciences as Sorcery slammed academics for their inability to write clearly. There was, he argued, an 'abundance of pompous bluff and paucity of new ideas', a use of 'obfuscating jargon' to conceal a lack of anything to say. This was, Andreski argued, another reflection of modern society's 'advanced stage of cretinization'.

[19] Fake Data: Mendel vs. Stapel. Diederik Stapel, Dirk Smeesters, and Lawrence Sanna published psychology papers with fake data. They each faked in their own idiosyncratic way; nevertheless, their data do share something in common. Real data are noisy. Theirs aren't.

Linguistic Traces of a Scientific Fraud: The Case of Diederik Stapel. When scientists report false data, does their writing style reflect their deception? In this study, we investigated the linguistic patterns of fraudulent (N = 24; 170,008 words) and genuine (N = 25; 189,705 words) publications first-authored by social psychologist Diederik Stapel. The analysis revealed that Stapel's fraudulent papers contained linguistic changes in science-related discourse dimensions, including more terms pertaining to methods, investigation, and certainty than his genuine papers. His writing style also matched patterns in other deceptive language, including fewer adjectives in fraudulent publications relative to genuine publications. Using differences in language dimensions, we were able to classify Stapel's publications with above-chance accuracy.

Did a five-day camp without digital devices really boost children's interpersonal skills?
Curtain up on second act for Dutch fraudster Stapel: College teacher.
Misjudgements will drive social trials underground.
Trendspotter: The brain-scan job interview.

[21] Fake-Data Colada.
Scientific method: Statistical errors.
Psychology Today apparently retracts Kanazawa piece on why black women are "rated less physically attractive".
The British amateur who debunked the mathematics of happiness. Nick Brown does not look like your average student.
Is social psychology really in crisis? The headlines: Disputed results a fresh blow for social psychology.
Journal Impact Factors. Every summer my e-mail is enlivened by people and organizations writing about the latest journal impact factors (IF).

Research finds 'US effect' exaggerates results in human behaviour studies. Scientists who study human behaviour are more likely than average to report exaggerated or eye-catching results if they are based in the United States, according to an analysis of more than 1,000 research papers in psychiatry and genetics. This bias could be due to the research culture in the US, the authors of the analysis said, which tends to preferentially reward scientists for the novelty and immediate impact of a piece of work over its quality or long-term contribution to the field. Daniele Fanelli of the University of Edinburgh, one of the authors of the latest analysis, said that there was intense competition in the US for research funds and, subsequently, pressure to report novel findings in prestigious, high-impact scientific journals. "We don't know what causes the US effect but we think the most likely explanation is that it's about the research environment in the US," he says. Fanelli worked with John Ioannidis of Stanford University on the study.

Three Cheers for Failure!
Most brain imaging papers fail to provide enough methodological detail to allow replication.
Nobel laureate challenges psychologists to clean up their act.
Crowdsourcing the psych lab.
Sort of Significant: Are Psychology Papers Just Nipping Past the p Value?
Neuroleadership – lots of old-fashioned psychology, very little neuroscience.
Report: Dutch 'Lord of the Data' Forged Dozens of Studies.
University withdraws articles - News - Erasmus Universiteit Rotterdam.
Fraud Detection Method Called Credible But Used Like an 'Instrument of Medieval Torture'.
The data detective.
Uncertainty shrouds psychologist's resignation.
Psychological Statistics: What's up with social psychology?
Simonsohn's Fraud Detection Technique Revealed » Random Assignment.
Psychology is Science » Random Assignment.
Replication studies: Bad copy.
An editorial board discusses fMRI analysis and "false-positive psychology" « The Hardest Science.
Despite Occasional Scandals, Science Can Police Itself - Commentary.
As Dutch Research Scandal Unfolds, Social Psychologists Question Themselves - Research.
How PTSD took over America.
Pseudoscience and the London Riots: Folk Psychology Run Amok.
Discrimination Hurts Real People « YourMorals.Org Moral Psychology Blog.
How 9/11 unearthed psychologists' limits.
Disgraced cognition researcher resigns from Harvard.
Francis Galton: The man who drew up the 'ugly map' of Britain.
How to survive in psychological research.
July/August 2011 > Features > Stanford Prison Experiment.
Evolution's Rainbow, from sparrows' stripes to lizard lesbianism.
What are participants really up to when they complete an online questionnaire?
Nymphomania and the brain. - By Jesse Bering.
The Curious Brain in the Museum - The 2010 Henry Cole Lecture.
The Psychology of Science Politicization.
David Eagleman: The human brain runs on conflict.
Behavioral economics under attack. - By Tim Harford.
Happiness index to gauge Britain's national mood.
Is this evidence that we can see the future? - life - 11 November 2010.
Manipulating morals: scientists target drugs that improve behaviour.
If suspect Jared Lee Loughner has schizophrenia, would that make him more likely to go on a shooting spree in Arizona? - By Vaughan Bell.
Being mad is one thing, going mad quite another.
Evolutionary psychology and high standards of scientific research.
The Trouble With Using Undergrads for Research.
Social Psychologists Detect Liberal Bias Within.
Is Cognitive Science Full of Crap?