Overview of the ShARe/CLEF eHealth Evaluation Lab 2014
Date
2014
Authors
Kelly, Liadh
Goeuriot, Lorraine
Suominen, Hanna
Schreck, Tobias
Leroy, Gondy
Mowery, Danielle L.
Velupillai, Sumithra
Chapman, Wendy W.
Martinez, David
Zuccon, Guido
Palotti, João
Editors
Kanoulas, Evangelos [et al.]
ISBN
978-3-319-11381-4
Publisher
Springer International Publishing
Series
Lecture Notes in Computer Science ; 8685
URI (citable link)
http://kops.uni-konstanz.de/handle/123456789/29989
DOI (citable link)
10.1007/978-3-319-11382-1_17
Link to the license
https://rightsstatements.org/page/InC/1.0/
Publication type
Contribution to a conference collection
Publication status
Published
Published in
Information Access Evaluation : Multilinguality, Multimodality, and Interaction ; 5th International Conference of the CLEF Initiative, CLEF 2014, Sheffield, UK, September 15-18, 2014. Proceedings / Evangelos Kanoulas ... (ed.). - Springer International Publishing, 2014. - (Lecture Notes in Computer Science ; 8685). - pp. 172-191. - ISBN 978-3-319-11381-4
Abstract
This paper reports on the 2nd ShARe/CLEF eHealth evaluation lab, which continues our evaluation resource building activities for the medical domain. In this lab we focus on patients’ information needs, as opposed to the more common campaign focus on the specialised information needs of physicians and other healthcare workers. The usage scenario of the lab is to help patients and their next-of-kin understand eHealth information, in particular clinical reports. The 1st ShARe/CLEF eHealth evaluation lab was held in 2013 and consisted of three tasks: Task 1 focused on named entity recognition and normalization of disorders; Task 2 on normalization of acronyms/abbreviations; and Task 3 on information retrieval to address questions patients may have when reading clinical reports. This year’s lab introduces a new challenge in Task 1 on visual-interactive search and exploration of eHealth data. Its aim is to help patients (or their next-of-kin) with readability issues related to their hospital discharge documents and with related information search on the Internet. Task 2 continues the information extraction work of the 2013 lab, focusing specifically on disorder attribute identification and normalization from clinical text. Finally, this year’s Task 3 extends the 2013 information retrieval task by cleaning the 2013 document collection and introducing a new query generation method and multilingual queries. The de-identified clinical reports used by the three tasks were from US intensive care and originated from the MIMIC II database. Other text documents for Tasks 1 and 3 were from the Internet and originated from the Khresmoi project. Task 2 annotations originated from the ShARe annotations. For Tasks 1 and 3, new annotations, queries, and relevance assessments were created. In total, 50, 79, and 91 people registered their interest in Tasks 1, 2, and 3, respectively, and 24 unique teams participated, with 1, 10, and 14 teams in Tasks 1, 2, and 3, respectively. The teams were from Africa, Asia, Canada, Europe, and North America. The Task 1 submission, reviewed by five expert peers, related to the task evaluation category of Effective use of interaction and targeted the needs of both expert and novice users. The best system achieved an accuracy of 0.868 in Task 2a, an F1-score of 0.576 in Task 2b, and a precision at 10 (P@10) of 0.756 in Task 3. The results demonstrate the substantial community interest in, and the capabilities of, these systems in making clinical reports easier for patients to understand. The organisers have made the data and tools available for future research and development.
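The scores quoted above correspond to standard evaluation measures. As a purely illustrative sketch (not the lab's official evaluation scripts, and using made-up data), the following Python functions show how accuracy, F1-score, and precision at 10 (P@10) are typically computed:

# Illustrative definitions of the measures reported in the abstract;
# not the lab's official evaluation tooling.

def accuracy(gold, predicted):
    # Fraction of items whose predicted label matches the gold label.
    return sum(g == p for g, p in zip(gold, predicted)) / len(gold)

def f1_score(gold, predicted):
    # Harmonic mean of precision and recall over binary labels (1 = positive).
    tp = sum(1 for g, p in zip(gold, predicted) if g == 1 and p == 1)
    fp = sum(1 for g, p in zip(gold, predicted) if g == 0 and p == 1)
    fn = sum(1 for g, p in zip(gold, predicted) if g == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def precision_at_10(ranked_doc_ids, relevant_doc_ids):
    # Fraction of the top 10 retrieved documents that are relevant.
    top = ranked_doc_ids[:10]
    return sum(1 for d in top if d in relevant_doc_ids) / 10

# Example with hypothetical data:
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))                 # 0.75
print(f1_score([1, 0, 1, 1], [1, 0, 0, 1]))                 # 0.8
print(precision_at_10(list(range(20)), {0, 2, 5, 11, 17}))  # 0.3

In the lab itself, Task 2 systems were scored against the ShARe gold-standard annotations and Task 3 systems against the newly created relevance assessments; the snippet only illustrates the arithmetic behind the reported numbers.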
Subject (DDC)
004 Computer Science
Keywords
Information Retrieval, Information Extraction, Information Visualisation, Evaluation, Medical Informatics, Test-set Generation, Text Classification, Text Segmentation
Conference
CLEF 2014 Conference, Sep 15, 2014 - Sep 18, 2014, Sheffield, UK
Cite This
ISO 690
KELLY, Liadh, Lorraine GOEURIOT, Hanna SUOMINEN, Tobias SCHRECK, Gondy LEROY, Danielle L. MOWERY, Sumithra VELUPILLAI, Wendy W. CHAPMAN, David MARTINEZ, Guido ZUCCON, João PALOTTI, 2014. Overview of the ShARe/CLEF eHealth Evaluation Lab 2014. CLEF 2014 Conference. Sheffield, UK, Sep 15, 2014 - Sep 18, 2014. In: Evangelos KANOULAS ..., ed. Information Access Evaluation : Multilinguality, Multimodality, and Interaction ; 5th International Conference of the CLEF Initiative, CLEF 2014, Sheffield, UK, September 15-18, 2014. Proceedings. Springer International Publishing, pp. 172-191. ISBN 978-3-319-11381-4. Available under: doi: 10.1007/978-3-319-11382-1_17
BibTeX
@inproceedings{Kelly2014Overv-29989,
  author    = {Kelly, Liadh and Goeuriot, Lorraine and Suominen, Hanna and Schreck, Tobias and Leroy, Gondy and Mowery, Danielle L. and Velupillai, Sumithra and Chapman, Wendy W. and Martinez, David and Zuccon, Guido and Palotti, João},
  editor    = {Evangelos Kanoulas ...},
  title     = {Overview of the ShARe/CLEF eHealth Evaluation Lab 2014},
  booktitle = {Information Access Evaluation : Multilinguality, Multimodality, and Interaction ; 5th International Conference of the CLEF Initiative, CLEF 2014, Sheffield, UK, September 15-18, 2014. Proceedings},
  series    = {Lecture Notes in Computer Science},
  number    = {8685},
  publisher = {Springer International Publishing},
  year      = {2014},
  pages     = {172--191},
  isbn      = {978-3-319-11381-4},
  doi       = {10.1007/978-3-319-11382-1_17}
}
Bibliography of Konstanz
Yes