No-reference Video Quality Assessment and Applications


Files in this item

Checksum: MD5: fde94ce37537559254d6a8af547b7048

ZHU, Kongfeng, 2014. No-reference Video Quality Assessment and Applications

@phdthesis{Zhu2014No-re-28920,
  title   = {No-reference Video Quality Assessment and Applications},
  year    = {2014},
  author  = {Zhu, Kongfeng},
  address = {Konstanz},
  school  = {Universität Konstanz}
}

<rdf:RDF xmlns:rdf="" xmlns:bibo="" xmlns:dc="" xmlns:dcterms="" xmlns:xsd="" >
  <rdf:Description rdf:about="">
    <dc:language>eng</dc:language>
    <dcterms:available rdf:datatype="">2014-09-04T12:52:27Z</dcterms:available>
    <dc:date rdf:datatype="">2014-09-04T12:52:27Z</dc:date>
    <dcterms:abstract xml:lang="eng">With more and more visual signals being received by human observers, an important aspect of the quality of experience of such stimuli is the perceived visual quality. In this thesis, new techniques to assess this perceived visual quality of natural videos without a pristine reference video, referred to as no-reference video quality assessment (NR-VQA), are presented, in order to evaluate the performance of existing devices for video capturing or video compression. These techniques adopt a two-stage NR-VQA framework, in which the two stages are distortion measurement and quality prediction. Three NR-VQA metrics are designed to evaluate the performance of video imaging systems, while two computational NR-VQA models are proposed to assess the quality of compressed videos. An optimizing strategy is also designed for feature pooling and prediction models of NR-VQA algorithms.</dcterms:abstract>
    <dcterms:rights rdf:resource=""/>
    <bibo:uri rdf:resource=""/>
    <dcterms:title>No-reference Video Quality Assessment and Applications</dcterms:title>
    <dc:rights>deposit-license</dc:rights>
    <dc:creator>Zhu, Kongfeng</dc:creator>
    <dcterms:issued>2014</dcterms:issued>
    <dc:contributor>Zhu, Kongfeng</dc:contributor>
  </rdf:Description>
</rdf:RDF>
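The abstract's two-stage NR-VQA framework (distortion measurement followed by quality prediction) is commonly realized as feature extraction plus a regressor trained on subjective scores. The sketch below is a minimal illustration of that pipeline, not the thesis's actual metrics: the two distortion features (temporal difference energy, mean spatial gradient) and the least-squares linear predictor are hypothetical stand-ins, and the videos are synthetic arrays.

```python
import numpy as np

# Stage 1 (distortion measurement): extract per-video features.
# These two features are hypothetical stand-ins, not the thesis's metrics.
def distortion_features(video):
    # video: (frames, height, width) array of luma values
    temporal = np.mean(np.abs(np.diff(video, axis=0)))  # frame-difference energy
    gy, gx = np.gradient(video.mean(axis=0))            # gradients of mean frame
    spatial = np.mean(np.hypot(gx, gy))                 # mean gradient magnitude
    return np.array([temporal, spatial])

# Stage 2 (quality prediction): map features to a quality score.
# A least-squares linear model stands in for the learned prediction models.
def train_predictor(feature_matrix, subjective_scores):
    X = np.hstack([feature_matrix, np.ones((len(feature_matrix), 1))])
    weights, *_ = np.linalg.lstsq(X, subjective_scores, rcond=None)
    return lambda f: float(np.append(f, 1.0) @ weights)

rng = np.random.default_rng(0)
videos = [
    np.tile(rng.random((16, 16)), (10, 1, 1)),          # static content
    rng.random((10, 16, 16)),                           # frame-wise noise
    np.cumsum(rng.random((10, 16, 16)), axis=0) / 10.0, # slow drift
]
feats = np.array([distortion_features(v) for v in videos])
predict = train_predictor(feats, np.array([2.0, 3.0, 4.5]))  # mock subjective scores
score = predict(distortion_features(videos[1]))
```

With three videos and three model parameters, the linear predictor interpolates the mock subjective scores exactly; a real system would instead train on a subjective-quality database and validate on held-out videos.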

File downloads since 01.10.2014

Zhu_289206.pdf 621
