Publication: Teachers’ judgment accuracy: a replication check by psychometric meta-analysis
Abstract
Teachers’ judgment accuracy is a core competency in their daily business. Due to its importance, several meta-analyses have estimated how accurately teachers judge students’ academic achievements by measuring teachers’ judgment accuracy (i.e., the correlation between teachers’ judgments of students’ academic abilities and students’ scores on achievement tests). In our study, we considered previous meta-analyses and updated these databases and the analytic combination of data using a psychometric meta-analysis to explain variations in results across studies. Our results demonstrate the importance of considering aggregation and publication bias as well as correcting for the most important artifacts (e.g., sampling and measurement error), but also that most studies fail to report the data needed for conducting a meta-analysis according to current best practices. We find that previous reviews have underestimated teachers’ judgment accuracy and overestimated the variance in estimates of teachers’ judgment accuracy across studies because at least 10% of this variance may be associated with common artifacts. We conclude that ignoring artifacts, as in classical meta-analysis, may lead one to erroneously conclude that moderator variables, instead of artifacts, explain any variation. We describe how online data repositories could improve the scientific process and the potential for using psychometric meta-analysis to synthesize results and assess replicability.
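The abstract refers to psychometric (Hunter-Schmidt style) meta-analysis, in which observed correlations are corrected for common statistical artifacts such as sampling error and measurement error (attenuation). The following Python sketch illustrates those corrections with made-up correlations, sample sizes, and reliabilities; it is only an assumed, minimal illustration of the general technique, not the analysis pipeline used in the paper.

import math

# Hypothetical (r, N) pairs: observed judgment/test-score correlations per study
studies = [(0.55, 120), (0.68, 45), (0.61, 230), (0.49, 80), (0.72, 60)]

# Hypothetical reliabilities of the judgment measure (rxx) and achievement test (ryy)
rxx, ryy = 0.80, 0.90

k = len(studies)
n_total = sum(n for _, n in studies)
n_bar = n_total / k

# Sample-size weighted mean observed correlation
r_bar = sum(r * n for r, n in studies) / n_total

# Observed (weighted) variance of correlations across studies
var_obs = sum(n * (r - r_bar) ** 2 for r, n in studies) / n_total

# Variance expected from sampling error alone (Hunter-Schmidt approximation)
var_sampling = (1 - r_bar ** 2) ** 2 / (n_bar - 1)

# Share of the observed between-study variance attributable to this artifact
pct_artifact = var_sampling / var_obs

# Mean correlation corrected for attenuation due to measurement error
rho_hat = r_bar / math.sqrt(rxx * ryy)

print(f"mean observed r               = {r_bar:.3f}")
print(f"corrected rho estimate        = {rho_hat:.3f}")
print(f"% variance from sampling error = {100 * pct_artifact:.1f}%")

With such corrections, part of the apparent between-study variance disappears and the corrected mean correlation is larger than the observed mean, which is the pattern the abstract describes when it notes that earlier reviews underestimated judgment accuracy and overestimated its variability.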
Cite
ISO 690
KAUFMANN, Esther, 2024. Teachers’ judgment accuracy : a replication check by psychometric meta-analysis. In: PLOS ONE. Public Library of Science (PLoS). 2024, 19(7), e0307594. eISSN 1932-6203. Available at: doi: 10.1371/journal.pone.0307594
BibTeX
@article{Kaufmann2024-07-25Teach-70585,
  author  = {Kaufmann, Esther},
  title   = {Teachers’ judgment accuracy : a replication check by psychometric meta-analysis},
  journal = {PLOS ONE},
  year    = {2024},
  volume  = {19},
  number  = {7},
  doi     = {10.1371/journal.pone.0307594},
  note    = {Article Number: e0307594}
}
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:bibo="http://purl.org/ontology/bibo/"
         xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:void="http://rdfs.org/ns/void#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/70585">
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/70585"/>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by/4.0/"/>
    <dcterms:issued>2024-07-25</dcterms:issued>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-08-15T13:20:58Z</dcterms:available>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/70585/1/Kaufmann_2-12l5frzgmk1096.pdf"/>
    <dcterms:title>Teachers’ judgment accuracy : a replication check by psychometric meta-analysis</dcterms:title>
    <dcterms:abstract>Teachers’ judgment accuracy is a core competency in their daily business. Due to its importance, several meta-analyses have estimated how accurately teachers judge students’ academic achievements by measuring teachers’ judgment accuracy (i.e., the correlation between teachers’ judgments of students’ academic abilities and students’ scores on achievement tests). In our study, we considered previous meta-analyses and updated these databases and the analytic combination of data using a psychometric meta-analysis to explain variations in results across studies. Our results demonstrate the importance of considering aggregation and publication bias as well as correcting for the most important artifacts (e.g., sampling and measurement error), but also that most studies fail to report the data needed for conducting a meta-analysis according to current best practices. We find that previous reviews have underestimated teachers’ judgment accuracy and overestimated the variance in estimates of teachers’ judgment accuracy across studies because at least 10% of this variance may be associated with common artifacts. We conclude that ignoring artifacts, as in classical meta-analysis, may lead one to erroneously conclude that moderator variables, instead of artifacts, explain any variation. We describe how online data repositories could improve the scientific process and the potential for using psychometric meta-analysis to synthesize results and assess replicability.</dcterms:abstract>
    <dc:language>eng</dc:language>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:rights>Attribution 4.0 International</dc:rights>
    <dc:contributor>Kaufmann, Esther</dc:contributor>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/70585/1/Kaufmann_2-12l5frzgmk1096.pdf"/>
    <dc:creator>Kaufmann, Esther</dc:creator>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-08-15T13:20:58Z</dc:date>
  </rdf:Description>
</rdf:RDF>