Crowdsourcing Quality of Experience Experiments
| dc.contributor.author | Egger-Lampl, Sebastian | |
| dc.contributor.author | Redi, Judith | |
| dc.contributor.author | Hoßfeld, Tobias | |
| dc.contributor.author | Hirth, Matthias | |
| dc.contributor.author | Möller, Sebastian | |
| dc.contributor.author | Naderi, Babak | |
| dc.contributor.author | Keimel, Christian | |
| dc.contributor.author | Saupe, Dietmar | |
| dc.date.accessioned | 2018-02-02T12:11:32Z | |
| dc.date.available | 2018-02-02T12:11:32Z | |
| dc.date.issued | 2017-09-28 | eng |
| dc.description.abstract | Crowdsourcing enables new possibilities for QoE evaluation by moving the evaluation task from the traditional laboratory environment into the Internet, allowing researchers to easily access a global pool of workers for the evaluation task. This makes it not only possible to include a more diverse population and real-life environments into the evaluation, but also reduces the turn-around time and increases the number of subjects participating in an evaluation campaign significantly, thereby circumventing bottle-necks in traditional laboratory setups. In order to utilise these advantages, the differences between laboratory-based and crowd-based QoE evaluation are discussed in this chapter. | eng |
| dc.description.version | published | eng |
| dc.identifier.doi | 10.1007/978-3-319-66435-4_7 | eng |
| dc.identifier.ppn | 505522144 | |
| dc.identifier.uri | https://kops.uni-konstanz.de/handle/123456789/41214 | |
| dc.language.iso | eng | eng |
| dc.rights | terms-of-use | |
| dc.rights.uri | https://rightsstatements.org/page/InC/1.0/ | |
| dc.subject.ddc | 004 | eng |
| dc.title | Crowdsourcing Quality of Experience Experiments | eng |
| dc.type | INPROCEEDINGS | eng |
| dspace.entity.type | Publication | |
| kops.citation.bibtex | @inproceedings{EggerLampl2017-09-28Crowd-41214,
year={2017},
doi={10.1007/978-3-319-66435-4_7},
title={Crowdsourcing Quality of Experience Experiments},
number={10264},
isbn={978-3-319-66434-7},
issn={0302-9743},
publisher={Springer},
address={Cham},
series={Lecture Notes in Computer Science},
booktitle={Evaluation in the Crowd : Crowdsourcing and Human-Centered Experiments : Revised Contributions},
pages={154--190},
editor={Archambault, Daniel and Purchase, Helen and Hoßfeld, Tobias},
author={Egger-Lampl, Sebastian and Redi, Judith and Hoßfeld, Tobias and Hirth, Matthias and Möller, Sebastian and Naderi, Babak and Keimel, Christian and Saupe, Dietmar}
} | |
| kops.citation.iso690 | EGGER-LAMPL, Sebastian, Judith REDI, Tobias HOSSFELD, Matthias HIRTH, Sebastian MÖLLER, Babak NADERI, Christian KEIMEL, Dietmar SAUPE, 2017. Crowdsourcing Quality of Experience Experiments. Dagstuhl Seminar 15481. Dagstuhl, Wadern, Nov 22, 2015 - Nov 27, 2015. In: ARCHAMBAULT, Daniel, ed., Helen PURCHASE, ed., Tobias HOSSFELD, ed.. Evaluation in the Crowd : Crowdsourcing and Human-Centered Experiments : Revised Contributions. Cham: Springer, 2017, pp. 154-190. Lecture Notes in Computer Science. 10264. ISSN 0302-9743. eISSN 1611-3349. ISBN 978-3-319-66434-7. Available under: doi: 10.1007/978-3-319-66435-4_7 | eng |
| kops.citation.rdf | <rdf:RDF
xmlns:dcterms="http://purl.org/dc/terms/"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:bibo="http://purl.org/ontology/bibo/"
xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
xmlns:foaf="http://xmlns.com/foaf/0.1/"
xmlns:void="http://rdfs.org/ns/void#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#" >
<rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/41214">
<dc:contributor>Saupe, Dietmar</dc:contributor>
<dc:creator>Hirth, Matthias</dc:creator>
<dc:creator>Redi, Judith</dc:creator>
<dc:contributor>Keimel, Christian</dc:contributor>
<dc:contributor>Naderi, Babak</dc:contributor>
<dc:creator>Naderi, Babak</dc:creator>
<dc:creator>Keimel, Christian</dc:creator>
<dc:creator>Saupe, Dietmar</dc:creator>
<dc:contributor>Redi, Judith</dc:contributor>
<dc:creator>Möller, Sebastian</dc:creator>
<dcterms:title>Crowdsourcing Quality of Experience Experiments</dcterms:title>
<dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2018-02-02T12:11:32Z</dcterms:available>
<dc:contributor>Egger-Lampl, Sebastian</dc:contributor>
<dc:contributor>Hoßfeld, Tobias</dc:contributor>
<dcterms:abstract xml:lang="eng">Crowdsourcing enables new possibilities for QoE evaluation by moving the evaluation task from the traditional laboratory environment into the Internet, allowing researchers to easily access a global pool of workers for the evaluation task. This makes it not only possible to include a more diverse population and real-life environments into the evaluation, but also reduces the turn-around time and increases the number of subjects participating in an evaluation campaign significantly, thereby circumventing bottle-necks in traditional laboratory setups. In order to utilise these advantages, the differences between laboratory-based and crowd-based QoE evaluation are discussed in this chapter.</dcterms:abstract>
<dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/41214/1/Egger-Lampl_2-5iaoqamuzfkz9.pdf"/>
<dc:creator>Egger-Lampl, Sebastian</dc:creator>
<dc:contributor>Hirth, Matthias</dc:contributor>
<dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
<dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
<dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2018-02-02T12:11:32Z</dc:date>
<bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/41214"/>
<dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/41214/1/Egger-Lampl_2-5iaoqamuzfkz9.pdf"/>
<dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
<dc:language>eng</dc:language>
<dc:contributor>Möller, Sebastian</dc:contributor>
<dc:rights>terms-of-use</dc:rights>
<dcterms:issued>2017-09-28</dcterms:issued>
<dc:creator>Hoßfeld, Tobias</dc:creator>
</rdf:Description>
</rdf:RDF> | |
| kops.conferencefield | Dagstuhl Seminar 15481, Nov 22, 2015 - Nov 27, 2015, Dagstuhl, Wadern | eng |
| kops.date.conferenceEnd | 2015-11-27 | eng |
| kops.date.conferenceStart | 2015-11-22 | eng |
| kops.description.openAccess | openaccessgreen | |
| kops.flag.knbibliography | true | |
| kops.identifier.nbn | urn:nbn:de:bsz:352-2-5iaoqamuzfkz9 | |
| kops.location.conference | Dagstuhl, Wadern | eng |
| kops.sourcefield | ARCHAMBAULT, Daniel, ed., Helen PURCHASE, ed., Tobias HOSSFELD, ed.. <i>Evaluation in the Crowd : Crowdsourcing and Human-Centered Experiments : Revised Contributions</i>. Cham: Springer, 2017, pp. 154-190. Lecture Notes in Computer Science. 10264. ISSN 0302-9743. eISSN 1611-3349. ISBN 978-3-319-66434-7. Available under: doi: 10.1007/978-3-319-66435-4_7 | deu |
| kops.sourcefield.plain | ARCHAMBAULT, Daniel, ed., Helen PURCHASE, ed., Tobias HOSSFELD, ed.. Evaluation in the Crowd : Crowdsourcing and Human-Centered Experiments : Revised Contributions. Cham: Springer, 2017, pp. 154-190. Lecture Notes in Computer Science. 10264. ISSN 0302-9743. eISSN 1611-3349. ISBN 978-3-319-66434-7. Available under: doi: 10.1007/978-3-319-66435-4_7 | eng |
| kops.title.conference | Dagstuhl Seminar 15481 | eng |
| relation.isAuthorOfPublication | fffb576d-6ec6-4221-8401-77f1d117a9b9 | |
| relation.isAuthorOfPublication.latestForDiscovery | fffb576d-6ec6-4221-8401-77f1d117a9b9 | |
| source.bibliographicInfo.fromPage | 154 | eng |
| source.bibliographicInfo.seriesNumber | 10264 | eng |
| source.bibliographicInfo.toPage | 190 | eng |
| source.contributor.editor | Archambault, Daniel | |
| source.contributor.editor | Purchase, Helen | |
| source.contributor.editor | Hoßfeld, Tobias | |
| source.identifier.eissn | 1611-3349 | eng |
| source.identifier.isbn | 978-3-319-66434-7 | eng |
| source.identifier.issn | 0302-9743 | eng |
| source.publisher | Springer | eng |
| source.publisher.location | Cham | eng |
| source.relation.ispartofseries | Lecture Notes in Computer Science | eng |
| source.title | Evaluation in the Crowd : Crowdsourcing and Human-Centered Experiments : Revised Contributions | eng |
Files
Original bundle
1 - 1 of 1
- Name:
- Egger-Lampl_2-5iaoqamuzfkz9.pdf
- Size:
- 329 KB
- Format:
- Adobe Portable Document Format