Reproducibility of science

U.S. science officials take aim at shoddy studies. By Sharon Begley | NEW YORK, Jan 27. After years in which a handful of scientists raised alarms about biomedical studies that cannot be independently confirmed, raising the possibility that much of the research literature has been compromised by widespread incompetence or even deception, top U.S. science officials are acknowledging the crisis of "irreproducibility."

U.S. science officials take aim at shoddy studies

"We have to take this seriously," Dr. Francis Collins, director of the National Institutes of Health (NIH), the country's largest funder of basic biomedical research, told Reuters in an interview, citing the waste of time and money spent trying to build on studies whose results are a mirage or whose methods are too opaque for others to follow. In an essay in the journal Nature published on Monday, Collins warned that "the checks and balances that once ensured scientific fidelity have been hobbled," and outlined steps NIH will take to combat the "non-replication" problem.

Reproducibility in Chemical Research - Bergman - 2016 - Angewandte Chemie International Edition. “… To what extent is reproducibility a significant issue in chemical research?

Reproducibility in Chemical Research - Bergman - 2016 - Angewandte Chemie International Edition

How can problems involving irreproducibility be minimized? … Researchers should be aware of the dangers of unconscious investigator bias, all papers should provide adequate experimental detail, and reviewers have a responsibility to carefully examine papers for adequacy of experimental detail and support for the conclusions …” Read more in the Editorial by Robert G. Bergman and Rick L. Danheiser. Promises and pitfalls of data sharing in qualitative research.

Promises and pitfalls of data sharing in qualitative research

Author affiliations: Chester M. Pierce, MD Division of Global Psychiatry, Massachusetts General Hospital, Boston, USA; Harvard Center for Population and Development Studies, Cambridge, USA; Mbarara University of Science and Technology, Mbarara, Uganda; Duke Global Health Institute, Duke University, Durham, USA; Department of Global Health and Population, Harvard T.H. Chan School of Public Health, Boston, USA; Department of Sociology, University of Toronto, Toronto, Canada; Department of Sociology, Yale University, New Haven, USA; Department of Medicine, University of California at San Francisco, San Francisco, USA; Department of Social and Behavioral Sciences, School of Nursing, University of California at San Francisco, San Francisco, USA. Received 20 May 2016; revised 30 July 2016; accepted 2 August 2016; available online 9 August 2016. doi:10.1016/j.socscimed.2016.08.004.

Challenges in irreproducible research. Reality check on reproducibility. Is there a reproducibility crisis in science?

Reality check on reproducibility

Yes, according to the readers of Nature. Two-thirds of researchers who responded to a survey by this journal said that current levels of reproducibility are a major problem. The ability to reproduce experiments is at the heart of science, yet failure to do so is a routine part of research. 1,500 scientists lift the lid on reproducibility. More than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments.

1,500 scientists lift the lid on reproducibility

Those are some of the telling figures that emerged from Nature's survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research. The data reveal sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant 'crisis' of reproducibility, less than 31% think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature.

Data on how much of the scientific literature is reproducible are rare and generally bleak. Repetitive flaws. From next week, scientists who submit grant applications to the US National Institutes of Health (NIH) will be asked to take a little more care.

Repetitive flaws

As part of an increasing drive to boost the reliability of research, the NIH will require applicants to explain the scientific premise behind their proposals and defend the quality of their experimental designs. Let’s think about cognitive bias. “Ever since I first learned about confirmation bias I’ve been seeing it everywhere.”

Let’s think about cognitive bias

So said British author and broadcaster Jon Ronson in So You’ve Been Publicly Shamed (Picador, 2015). You will see a lot of cognitive bias in this week’s Nature. In a series of articles, we examine the impact that bias can have on research, and the best ways to identify and tackle it. One enemy of robust science is our humanity — our appetite for being right, and our tendency to find patterns in noise, to see supporting evidence for what we already believe is true, and to ignore the facts that do not fit. The sources and types of such cognitive bias — and the fallacies they produce — are becoming more widely appreciated. Journals unite for reproducibility. Reproducibility, rigour, transparency and independent verification are cornerstones of the scientific method.

Journals unite for reproducibility

Of course, just because a result is reproducible does not make it right, and just because it is not reproducible does not make it wrong. A transparent and rigorous approach, however, will almost always shine a light on issues of reproducibility. This light ensures that science moves forward, through independent verifications as well as the course corrections that come from refutations and the objective examination of the resulting data. It was with the goal of strengthening such approaches in the biomedical sciences that a group of editors representing more than 30 major journals; representatives from funding agencies; and scientific leaders assembled at the American Association for the Advancement of Science’s headquarters in June 2014 to discuss principles and guidelines for preclinical biomedical research.

Reproducibility will not cure what ails science. Leaders of the scientific community, nudged by the media (including Nature), are acknowledging that a culture of science focused on rewarding eye-catching and positive findings may have resulted in major bodies of knowledge that cannot be reproduced.

Reproducibility will not cure what ails science

Private-sector, academic and non-profit groups are leading multiple efforts to replicate selected published findings, and so far the results do not make happy reading. Several high-profile endeavours have been unable to reproduce the large majority of peer-reviewed studies that they examined. Meanwhile, the US National Academies is preparing to publish a high-profile report on scientific integrity that will flag irreproducibility as a key concern for the research enterprise. As the spotlight shines on reproducibility, uncomfortable issues will emerge at the interface of research and 'evidence-based' policy. “Quality assurance will increasingly become a matter of political interpretation.”

Reproducibility: A tragedy of errors. Just how error-prone and self-correcting is science?

Reproducibility: A tragedy of errors

We have spent the past 18 months getting a sense of that. We are a group of researchers working on obesity, nutrition and energetics. In the summer of 2014, one of us (D.B.A.) read a research paper in a well-regarded journal estimating how a change in fast-food consumption would affect children's weight, and he noted that the analysis applied a mathematical model that overestimated effects by more than tenfold. We and others submitted a letter1 to the editor explaining the problem. Robust research: Institutions must do their part for reproducibility.

Irreproducible research poses an enormous burden: it delays treatments, wastes patients' and scientists' time, and squanders billions of research dollars. It is also widespread. An unpublished 2015 survey by the American Society for Cell Biology found that more than two-thirds of respondents had on at least one occasion been unable to reproduce published results.

Policy: NIH plans to enhance reproducibility. A growing chorus of concern, from scientists and laypeople, contends that the complex system for ensuring the reproducibility of biomedical research is failing and is in need of restructuring1, 2. As leaders of the US National Institutes of Health (NIH), we share this concern and here explore some of the significant interventions that we are planning.

Science has long been regarded as 'self-correcting', given that it is founded on the replication of earlier work. Over the long term, that principle remains true. In the shorter term, however, the checks and balances that once ensured scientific fidelity have been hobbled. Let's be clear: with rare exceptions, we have no evidence to suggest that irreproducibility is caused by scientific misconduct. Instead, a complex array of other factors seems to have contributed to the lack of reproducibility. Reproducibility: The risks of the replication drive. Every once in a while, one of my postdocs or students asks, in a grave voice, to speak to me privately.

With terror in their eyes, they tell me that they have been unable to replicate one of my laboratory's previous experiments, no matter how hard they try. Replication is always a concern when dealing with systems as complex as the three-dimensional cell cultures routinely used in my lab. But with time and careful consideration of experimental conditions, they, and others, have always managed to replicate our previous data. Articles in both the scientific and popular press1–3 have addressed how frequently biologists are unable to repeat each other's experiments, even when using the same materials and methods.

But I am concerned about the latest drive by some in biology to have results replicated by an independent, self-appointed entity that will charge for the service. Replication and contradiction of highly cited research papers in psychiatry: 10-year follow-up. Estimating the reproducibility of psychological science. Empirically analyzing empirical evidence. One of the central goals in any scientific endeavor is to understand causality. Experiments that seek to demonstrate a cause/effect relation most often manipulate the postulated causal factor. Aarts et al. describe the replication of 100 experiments reported in papers published in 2008 in three high-ranking psychology journals.

Assessing whether the replication and the original experiment yielded the same result according to several criteria, they find that about one-third to one-half of the original findings were also observed in the replication study. Science, this issue: 10.1126/science.aac4716. Retraction Watch - Tracking retractions as a window into the scientific process.

Publication bias. Publication bias is a type of bias concerning which academic research is likely to be published, among what is available to be published. Publication bias is of interest because literature reviews of claims about support for a hypothesis, or of values for a parameter, will themselves be biased if the original literature is contaminated by publication bias.[1] While some preferences are desirable—for instance a bias against publication of flawed studies—a tendency of researchers and journal editors to prefer some outcomes over others (e.g., results showing a significant finding) leads to a problematic bias in the published literature.[2] Attempts to identify unpublished studies often prove difficult or are unsatisfactory.[1] One effort to decrease this problem is reflected in the move by some journals to require that studies submitted for publication be pre-registered (registering a study prior to data collection and analysis).
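The mechanism described above can be made concrete with a small simulation. This is a hypothetical sketch, not drawn from any of the articles cited here: many studies of a true effect of exactly zero are run, and only "statistically significant" results pass the publication filter, so the published record reports a nonzero effect that is pure selection artifact.

```python
import random
import statistics

# Hypothetical sketch: a publication filter that accepts only
# statistically significant results. The true effect is zero, so any
# nonzero average among "published" studies is selection bias.
random.seed(42)

def run_study(true_effect=0.0, n=20):
    """Simulate one study; return its estimated effect and significance."""
    sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    significant = abs(mean / se) > 1.96  # rough two-sided test, alpha = 0.05
    return mean, significant

all_effects = []
published_effects = []
for _ in range(5000):
    effect, significant = run_study()
    all_effects.append(abs(effect))
    if significant:  # the publication filter: only significant results appear
        published_effects.append(abs(effect))

print(f"studies run:              {len(all_effects)}")
print(f"studies 'published':      {len(published_effects)}")
print(f"mean |effect|, all:       {statistics.fmean(all_effects):.3f}")
print(f"mean |effect|, published: {statistics.fmean(published_effects):.3f}")
```

A literature review that averaged only the "published" effects would conclude there is a real effect where none exists, which is exactly why pre-registration (committing to the analysis before seeing the data) is proposed as a countermeasure.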

Self-correction in science at work. Week after week, news outlets carry word of new scientific discoveries, but the media sometimes give suspect science equal play with substantive discoveries. Careful qualifications about what is known are lost in categorical headlines. Science is broken. These academics think they have the answer. Splashy science frauds usually spark conversations about the fact that science sometimes fails. But what often gets missed are the decidedly less sexy structural flaws within science — from publication bias (the fact that the studies that end up published tend to have positive results) to the lack of replication (or the attempt to validate previous findings by reproducing experiments) and transparency.

TOP Guidelines. Solving reproducibility. Research papers: Journals should drive data reproducibility. Peer-reviewed journals — as well as researchers and their funders — must take responsibility for improving the reproducibility of published results (see Nature 533, 452–454; 2016). I suggest that journals should be required to sign a global statement indicating that, to the best of their knowledge, the data that they publish are reproducible. This statement would be collaboratively formulated by the editors-in-chief in accordance with recommendations from the International Committee of Medical Journal Editors and guidelines proposed by the US National Institutes of Health, Nature and Science (see Nature 515, 7; 2014 and go.nature.com/29bxphv).

Science under Societal Scrutiny: Reproducibility in Climate Science - Reproducibility: Principles, Problems, Practices, and Prospects - Feulner. Reproducibility is often considered one of the hallmarks of the modern scientific method. For climate science, the difficulties and demands appear even more pronounced than for other fields within science. The main difficulties arise from the complexity of the system and from the fact that there is only one specimen of planet Earth, which cannot be studied under controlled conditions in the laboratory.

In that sense climate science is indeed under closer societal scrutiny than other scientific disciplines. This chapter discusses some of the problems related to climate data highlighting the challenges for reproducibility for observational climate science, climate modeling, and paleoclimate research. The reproducibility challenge for data-based climate science is very similar to any other scientific discipline working with data, and the standard solutions to these problems apply for climate science as well.

The Quest for Reproducibility Viewed in the Context of Innovation Societies - Reproducibility: Principles, Problems, Practices, and Prospects - Maasen. This chapter reframes the issue of reproducibility in science and technology within the context of contemporary knowledge societies that are characterized by a constant quest for innovation. Reproducibility, Objectivity, Invariance - Reproducibility: Principles, Problems, Practices, and Prospects - Tetens. Reproducibility of Experiments: Experimenters' Regress, Statistical Uncertainty Principle, and the Replication Imperative - Reproducibility: Principles, Problems, Practices, and Prospects - Collins.

Science is often flawed. It's time we embraced that. In his book Derailed, about his fall from academic grace, the Dutch psychologist Diederik Stapel explained his preferred method for manipulating scientific data in detail that would make any nerd's jaw drop: "I preferred to do it at home, late in the evening... Problems with scientific research: How science goes wrong. The new scientific revolution: Reproducibility at last. How scientists fool themselves – and how they can stop.