Publication:

Differentiable Top-k Classification Learning


Files

Petersen_2-1b1lxhiqw7rwh1.pdf
Size: 667.6 KB, Downloads: 62

Date

2022

Editors

Contact

Journal ISSN

Electronic ISSN

ISBN

Bibliographic data

Publisher

Series

Edition designation

DOI (citable link)
ArXiv-ID

International patent number

Research funding information

Project

Open access publication
Open Access Bookpart
Core facility of the University of Konstanz

Embargoed until

Title in another language

Publication type
Contribution to a conference proceedings
Publication status
Published

Published in

CHAUDHURI, Kamalika, ed., Stefanie JEGELKA, ed., Le SONG, ed. and others. International Conference on Machine Learning, Vol. 162. PMLR, 2022, pp. 17656-17668

Abstract

The top-k classification accuracy is one of the core metrics in machine learning. Here, k is conventionally a positive integer, such as 1 or 5, leading to top-1 or top-5 training objectives. In this work, we relax this assumption and optimize the model for multiple k simultaneously instead of using a single k. Leveraging recent advances in differentiable sorting and ranking, we propose a family of differentiable top-k cross-entropy classification losses. This allows training while not only considering the top-1 prediction, but also, e.g., the top-2 and top-5 predictions. We evaluate the proposed losses for fine-tuning on state-of-the-art architectures, as well as for training from scratch. We find that relaxing k not only produces better top-5 accuracies, but also leads to top-1 accuracy improvements. When fine-tuning publicly available ImageNet models, we achieve a new state-of-the-art for these models.
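The paper itself builds its losses on differentiable sorting and ranking networks; as a rough illustration only (not the authors' method), the idea of a smooth top-k surrogate can be sketched with a sigmoid-relaxed rank. The function name `soft_topk_loss`, the temperature `tau`, and the sigmoid gating form below are all assumptions of this sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_topk_loss(scores, target, k=5, tau=1.0):
    """Illustrative smooth surrogate for the top-k 0/1 classification loss.

    soft_rank is a differentiable estimate of the true class's rank:
    1 plus a sum of sigmoids of the score gaps to the other classes.
    The loss is -log of a sigmoid gate that approaches 1 when
    soft_rank <= k and approaches 0 otherwise.
    """
    s_y = scores[target]
    others = np.delete(scores, target)            # scores of all other classes
    soft_rank = 1.0 + sigmoid((others - s_y) / tau).sum()
    p_topk = sigmoid((k + 0.5 - soft_rank) / tau)  # smooth "rank <= k" indicator
    return -np.log(p_topk + 1e-12)
```

As in the abstract, such a relaxation can be evaluated for several k at once (e.g., averaging the k=1 and k=5 losses), so that training rewards placing the true class anywhere in the top k rather than only in the top position.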

Abstract in another language

Subject area (DDC)
004 Computer science

Keywords

Conference

39th International Conference on Machine Learning : PMLR 162, July 17, 2022 - July 23, 2022, Baltimore, Maryland
Review

Research project

Organisational units

Journal issue

Related datasets in KOPS

Cite

ISO 690
PETERSEN, Felix, Hilde KUEHNE, Christian BORGELT, Oliver DEUSSEN, 2022. Differentiable Top-k Classification Learning. 39th International Conference on Machine Learning : PMLR 162. Baltimore, Maryland, July 17, 2022 - July 23, 2022. In: CHAUDHURI, Kamalika, ed., Stefanie JEGELKA, ed., Le SONG, ed. and others. International Conference on Machine Learning, Vol. 162. PMLR, 2022, pp. 17656-17668
BibTeX
@inproceedings{Petersen2022Diffe-67074,
  year={2022},
  title={Differentiable Top-k Classification Learning},
  url={https://proceedings.mlr.press/v162/petersen22a.html},
  publisher={PMLR},
  booktitle={International Conference on Machine Learning, Vol. 162},
  pages={17656--17668},
  editor={Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le},
  author={Petersen, Felix and Kuehne, Hilde and Borgelt, Christian and Deussen, Oliver}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/67074">
    <dc:creator>Kuehne, Hilde</dc:creator>
    <dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
    <dc:contributor>Petersen, Felix</dc:contributor>
    <dc:creator>Borgelt, Christian</dc:creator>
    <dc:creator>Petersen, Felix</dc:creator>
    <dc:rights>terms-of-use</dc:rights>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dc:contributor>Borgelt, Christian</dc:contributor>
    <dc:contributor>Deussen, Oliver</dc:contributor>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2023-06-07T07:14:00Z</dcterms:available>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/67074/1/Petersen_2-1b1lxhiqw7rwh1.pdf"/>
    <dcterms:abstract>The top-k classification accuracy is one of the core metrics in machine learning. Here, k is conventionally a positive integer, such as 1 or 5, leading to top-1 or top-5 training objectives. In this work, we relax this assumption and optimize the model for multiple k simultaneously instead of using a single k. Leveraging recent advances in differentiable sorting and ranking, we propose a family of differentiable top-k cross-entropy classification losses. This allows training while not only considering the top-1 prediction, but also, e.g., the top-2 and top-5 predictions. We evaluate the proposed losses for fine-tuning on state-of-the-art architectures, as well as for training from scratch. We find that relaxing k not only produces better top-5 accuracies, but also leads to top-1 accuracy improvements. When fine-tuning publicly available ImageNet models, we achieve a new state-of-the-art for these models.</dcterms:abstract>
    <dcterms:title>Differentiable Top-k Classification Learning</dcterms:title>
    <dc:creator>Deussen, Oliver</dc:creator>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/67074"/>
    <dc:contributor>Kuehne, Hilde</dc:contributor>
    <dc:language>eng</dc:language>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dcterms:issued>2022</dcterms:issued>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/67074/1/Petersen_2-1b1lxhiqw7rwh1.pdf"/>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2023-06-07T07:14:00Z</dc:date>
  </rdf:Description>
</rdf:RDF>

Internal note


Contact

URL check date

2023-06-07

Date of the dissertation examination

Type of funding

Comment on the publication

Alliance licence
Corresponding authors from the University of Konstanz present
International co-authors
University bibliography
Yes
Peer reviewed