What proportion of published research is likely to be false? Low sample size, small effect sizes, data dredging (also known as P-hacking), conflicts of interest, large numbers of scientists working competitively in silos without combining their efforts, and so on, may conspire to dramatically increase the probability that a published finding is incorrect (ref. 1). The field of metascience, the scientific study of science itself, is flourishing and has generated substantial empirical evidence for the existence and prevalence of threats to efficiency in knowledge accumulation (refs 2-7; Fig. 1). Data from many fields suggest that reproducibility is lower than is desirable (refs 8-14); one analysis estimates that 85% of biomedical research effort is wasted (ref. 14), while 90% of respondents to a recent survey in Nature agreed that there is a 'reproducibility crisis' (ref. 15). Whether 'crisis' is the appropriate term for the current state or trajectory of science is debatable, but the accumulated evidence indicates substantial room to improve research practices and so maximize the efficiency of the research community's use of the public's financial investment in research.

Here we propose a series of measures that we believe will improve research efficiency and the robustness of scientific findings by directly targeting specific threats to reproducible science. We argue for the adoption, evaluation and ongoing improvement of these measures to optimize the pace and efficiency of knowledge accumulation. The measures are organized into the following categories (ref. 16): methods, reporting and dissemination, reproducibility, evaluation and incentives. They are not intended to be exhaustive, but they provide a broad, practical and evidence-based set of actions that can be implemented by researchers, institutions, journals and funders. The measures and their current implementation are summarized in Table 1.
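One way to make the opening claim concrete is the positive predictive value (PPV) argument from ref. 1 (Ioannidis, 2005): the probability that a statistically significant finding is true depends on statistical power, the significance threshold and the pre-study odds that the tested relationship is real. The short Python sketch below computes PPV under that standard formulation; the function name and the parameter values are illustrative assumptions, not figures from the chapter.

    # A minimal sketch of the positive predictive value (PPV) formulation
    # in Ioannidis (2005), cited above as ref. 1. Parameter values below
    # are illustrative assumptions, not data from the chapter.
    #
    # PPV = (1 - beta) * R / ((1 - beta) * R + alpha)
    # where (1 - beta) is statistical power, alpha is the type-I error
    # rate, and R is the pre-study odds that the tested effect is real.

    def ppv(power: float, alpha: float, prior_odds: float) -> float:
        """Probability that a statistically significant finding is true."""
        true_positives = power * prior_odds   # rate of real effects detected
        false_positives = alpha               # rate of spurious "discoveries"
        return true_positives / (true_positives + false_positives)

    # A well-powered confirmatory test vs. an underpowered, long-shot one:
    print(ppv(power=0.80, alpha=0.05, prior_odds=1.0))   # ~0.94
    print(ppv(power=0.20, alpha=0.05, prior_odds=0.1))   # ~0.29

On these assumed values, a well-powered study testing a plausible hypothesis yields significant results that are true about 94% of the time, while an underpowered study testing an unlikely hypothesis is right less than a third of the time even at the conventional alpha of 0.05. That arithmetic is the quantitative core of the threats listed above.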
Marcus R. Munafò, Brian A. Nosek, Dorothy Bishop, Katherine S. Button, Chris Chambers, Nathalie Percie du Sert, Uri Simonsohn, Eric-Jan Wagenmakers, Jennifer J. Ware, John P. A. Ioannidis (2024). A manifesto for reproducible science. DOI: https://doi.org/10.1037/0000409-041
Type: Chapter in a book
Year: 2024
Authors: 10
Datasets: 0
Total Files: 0
Language: en
DOI: https://doi.org/10.1037/0000409-041