Publication: Estimating the reproducibility of psychological science
Files
Date
Authors
Editors
Journal ISSN
Electronic ISSN
ISBN
Bibliographic data
Publisher
Series
Edition
DOI (citable link)
International patent number
Research funding information
Project
Open Access publication
Collections
Core Facility of the University of Konstanz
Title in another language
Publication type
Publication status
Published in
Abstract
Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
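The abstract's "95% confidence interval" criterion can be illustrated numerically. A minimal Python sketch, assuming correlation effect sizes and a Fisher z-transform (one common approach, not necessarily the exact procedure used in the article), with hypothetical input values:

    import math

    def fisher_z(r):
        """Fisher z-transform of a correlation coefficient."""
        return 0.5 * math.log((1 + r) / (1 - r))

    def original_in_replication_ci(r_original, r_replication, n_replication):
        """Check whether the original effect size falls inside the replication's
        95% confidence interval, computed on the Fisher-z scale."""
        z_rep = fisher_z(r_replication)
        se = 1.0 / math.sqrt(n_replication - 3)   # standard error of Fisher z
        z_crit = 1.96                             # two-sided 95% normal quantile
        lower, upper = z_rep - z_crit * se, z_rep + z_crit * se
        return lower <= fisher_z(r_original) <= upper

    # Hypothetical numbers, not taken from the article:
    print(original_in_replication_ci(r_original=0.45, r_replication=0.20, n_replication=120))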
Abstract in another language
Subject (DDC)
Keywords
Conference
Review
Cite
ISO 690
NOSEK, Brian A., Tim KUHLMANN, Stefan STIEGER, 2015. Estimating the reproducibility of psychological science. In: Science. 2015, 349(6251), aac4716. ISSN 0036-8075. eISSN 1095-9203. Available under: doi: 10.1126/science.aac4716
BibTeX
@article{Nosek2015Estim-32994,
  year    = {2015},
  doi     = {10.1126/science.aac4716},
  title   = {Estimating the reproducibility of psychological science},
  number  = {6251},
  volume  = {349},
  issn    = {0036-8075},
  journal = {Science},
  author  = {Nosek, Brian A. and Kuhlmann, Tim and Stieger, Stefan},
  note    = {A summary of this article appears on page 943. Article Number: aac4716}
}
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:bibo="http://purl.org/ontology/bibo/"
         xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:void="http://rdfs.org/ns/void#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/32994">
    <dcterms:title>Estimating the reproducibility of psychological science</dcterms:title>
    <dc:creator>Stieger, Stefan</dc:creator>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2016-02-17T07:18:02Z</dcterms:available>
    <dc:contributor>Kuhlmann, Tim</dc:contributor>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <dc:creator>Kuhlmann, Tim</dc:creator>
    <dcterms:issued>2015</dcterms:issued>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2016-02-17T07:18:02Z</dc:date>
    <dc:contributor>Nosek, Brian A.</dc:contributor>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/32994"/>
    <dcterms:abstract xml:lang="eng">Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.</dcterms:abstract>
    <dc:creator>Nosek, Brian A.</dc:creator>
    <dc:contributor>Stieger, Stefan</dc:contributor>
    <dc:language>eng</dc:language>
  </rdf:Description>
</rdf:RDF>
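The RDF block above exposes the record's metadata as Dublin Core terms. A minimal Python sketch reading it with rdflib, assuming the XML is saved locally as record.rdf (a hypothetical filename):

    from rdflib import Graph, URIRef

    # Parse the record's RDF/XML (saved from the block above).
    g = Graph()
    g.parse("record.rdf", format="xml")

    record = URIRef("https://kops.uni-konstanz.de/server/rdf/resource/123456789/32994")
    title = g.value(record, URIRef("http://purl.org/dc/terms/title"))
    creators = [str(o) for o in g.objects(record, URIRef("http://purl.org/dc/elements/1.1/creator"))]

    print(title)     # Estimating the reproducibility of psychological science
    print(creators)  # listed creators of this record, e.g. Nosek, Kuhlmann, Stieger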