Robust Generalization despite Distribution Shift via Minimum Discriminating Information
Files
No files are associated with this item.
Date
2021
Authors
Sutter, Tobias; Krause, Andreas; Kuhn, Daniel
Editors
Ranzato, Marc'Aurelio; Beygelzimer, Alina; Dauphin, Yann et al.
Bibliographic data
Publication type
Contribution to a conference proceedings
Publication status
Published
Published in
Advances in Neural Information Processing Systems 34 pre-proceedings (NeurIPS 2021) / Ranzato, Marc'Aurelio; Beygelzimer, Alina; Dauphin, Yann et al. (eds.). - San Diego, CA : Neural Information Processing Systems Foundation, 2021
Abstract
Training models that perform well under distribution shifts is a central challenge in machine learning. In this paper, we introduce a modeling framework where, in addition to training data, we have partial structural knowledge of the shifted test distribution. We employ the principle of minimum discriminating information to embed the available prior knowledge, and use distributionally robust optimization to account for uncertainty due to the limited samples. By leveraging large deviation results, we obtain explicit generalization bounds with respect to the unknown shifted distribution. Lastly, we demonstrate the versatility of our framework by applying it to two rather distinct applications: (1) training classifiers on systematically biased data and (2) off-policy evaluation in Markov Decision Processes.
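As a rough illustrative sketch (not taken from the paper, and using placeholder notation), the principle of minimum discriminating information selects, among all distributions consistent with the known structural constraints, the one closest in relative entropy to a reference distribution such as the empirical training distribution:

\[
\widehat{P} \;\in\; \operatorname*{arg\,min}_{P \in \mathcal{C}} \; D_{\mathrm{KL}}\!\left(P \,\|\, \widehat{P}_n\right),
\qquad
\mathcal{C} \;=\; \bigl\{ P : \mathbb{E}_P[f_i(X)] = c_i,\;\; i = 1, \dots, m \bigr\},
\]

where \(\widehat{P}_n\) denotes the empirical distribution of the \(n\) training samples and the moment functions \(f_i\) with targets \(c_i\) stand in for the assumed partial structural knowledge of the shifted test distribution; a distributionally robust formulation would additionally hedge against an ambiguity set of distributions around this estimate.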
Subject (DDC)
004 Computer science
Conference
NeurIPS 2021 : 35th Conference on Neural Information Processing Systems (online), Dec 6, 2021 - Dec 14, 2021
Cite
ISO 690
SUTTER, Tobias, Andreas KRAUSE, Daniel KUHN, 2021. Robust Generalization despite Distribution Shift via Minimum Discriminating Information. NeurIPS 2021 : 35th Conference on Neural Information Processing Systems (online), Dec 6, 2021 - Dec 14, 2021. In: RANZATO, Marc'Aurelio, ed., Alina BEYGELZIMER, ed., Yann DAUPHIN, ed. and others. Advances in Neural Information Processing Systems 34 pre-proceedings (NeurIPS 2021). San Diego, CA : Neural Information Processing Systems Foundation
BibTeX
@inproceedings{Sutter2021Robus-55736,
  year      = {2021},
  title     = {Robust Generalization despite Distribution Shift via Minimum Discriminating Information},
  url       = {https://proceedings.neurips.cc/paper/2021/hash/f86890095c957e9b949d11d15f0d0cd5-Abstract.html},
  publisher = {Neural Information Processing Systems Foundation},
  address   = {San Diego, CA},
  booktitle = {Advances in Neural Information Processing Systems 34 pre-proceedings (NeurIPS 2021)},
  editor    = {Ranzato, Marc'Aurelio and Beygelzimer, Alina and Dauphin, Yann},
  author    = {Sutter, Tobias and Krause, Andreas and Kuhn, Daniel}
}
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:bibo="http://purl.org/ontology/bibo/"
         xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:void="http://rdfs.org/ns/void#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/55736">
    <dc:contributor>Kuhn, Daniel</dc:contributor>
    <dc:creator>Kuhn, Daniel</dc:creator>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/55736"/>
    <dc:rights>terms-of-use</dc:rights>
    <dc:creator>Krause, Andreas</dc:creator>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-12-02T12:27:08Z</dcterms:available>
    <dc:creator>Sutter, Tobias</dc:creator>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dc:language>eng</dc:language>
    <dc:contributor>Sutter, Tobias</dc:contributor>
    <dcterms:title>Robust Generalization despite Distribution Shift via Minimum Discriminating Information</dcterms:title>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:issued>2021</dcterms:issued>
    <dcterms:abstract xml:lang="eng">Training models that perform well under distribution shifts is a central challenge in machine learning. In this paper, we introduce a modeling framework where, in addition to training data, we have partial structural knowledge of the shifted test distribution. We employ the principle of minimum discriminating information to embed the available prior knowledge, and use distributionally robust optimization to account for uncertainty due to the limited samples. By leveraging large deviation results, we obtain explicit generalization bounds with respect to the unknown shifted distribution. Lastly, we demonstrate the versatility of our framework by demonstrating it on two rather distinct applications: (1) training classifiers on systematically biased data and (2) off-policy evaluation in Markov Decision Processes.</dcterms:abstract>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-12-02T12:27:08Z</dc:date>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
    <dc:contributor>Krause, Andreas</dc:contributor>
  </rdf:Description>
</rdf:RDF>
URL of original publication
https://proceedings.neurips.cc/paper/2021/hash/f86890095c957e9b949d11d15f0d0cd5-Abstract.html
Date URL was last checked
2021-11-23
University bibliography
Yes