Publication:

Computability aspects in statistical learning


Files

Kattermann_2-nbxfk51kav277.pdf
Size: 2.02 MB · Downloads: 172

Date

2025

Editors

Contact

Journal ISSN

Electronic ISSN

ISBN

Bibliographic data

Publisher

Series

Edition

DOI (citable link)
ArXiv-ID

International patent number

Link to license

Research funding information

Project

Open Access publication
Open Access Green
Core Facility of the Universität Konstanz

Embargoed until

Title in another language

Publication type
Master's thesis / diploma thesis
Publication status
Published

Published in

Abstract

Probably approximately correct (PAC) learning is the standard framework for supervised binary classification in statistical learning theory. Its purpose is to describe from a statistical perspective how machines can learn to make binary decisions based on finitely many examples. A classical result is the Fundamental Theorem of PAC Learning due to Blumer et al. (1989), which characterizes several seemingly different notions of PAC learnability by the finiteness of a combinatorial measure called the VC-dimension.
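
As a small illustration (not part of the thesis itself), the VC-dimension of a finite hypothesis class can be computed by brute force: a set is shattered when every one of its labelings is realized by some hypothesis. The `vc_dimension` helper below is a hypothetical sketch of this definition, with hypotheses represented as the sets of points they label 1.

```python
from itertools import combinations

def vc_dimension(domain, hypotheses):
    """Brute-force VC-dimension of a finite hypothesis class.

    Each hypothesis is given as the set of domain points it labels 1.
    A set S is shattered if every subset of S arises as h & S for
    some hypothesis h, i.e. all 2^|S| labelings of S are realized.
    """
    hyp_sets = [frozenset(h) for h in hypotheses]
    d = 0
    for k in range(1, len(domain) + 1):
        # Shattering is monotone: if no k-set is shattered,
        # no larger set can be shattered either, so we can stop.
        if any(
            len({h & frozenset(S) for h in hyp_sets}) == 2 ** k
            for S in combinations(domain, k)
        ):
            d = k
        else:
            break
    return d

# Threshold classifiers h_t = {x : x >= t} on {0, 1, 2, 3}:
# they shatter every singleton but no pair, so the VC-dimension is 1.
domain = [0, 1, 2, 3]
thresholds = [frozenset(x for x in domain if x >= t) for t in range(5)]
print(vc_dimension(domain, thresholds))  # → 1
```

By Blumer et al. (1989), finiteness of exactly this quantity characterizes PAC learnability of the class.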

In this thesis, PAC learning is studied from the viewpoint of computability theory, the standard framework for formalizing algorithms in theoretical computer science. Interest in computability-theoretic aspects of PAC learning has developed in recent years, initially to exclude undecidability results relying on uncomputable learning functions. Unfortunately, it turned out that the classical Fundamental Theorem of PAC Learning fails when learning functions are required to be computable. This causes the landscape of computable PAC learning to be very diverse, with numerous non-equivalent intermediate notions. This thesis provides a comprehensive overview of the field by establishing necessary and sufficient conditions for various notions and constructing counterexamples separating them.

A main result is a computable version of the Fundamental Theorem of PAC Learning due to Delle Rose et al. (2023) that is based on a computable analog of the VC-dimension called the effective VC-dimension. An extensive discussion of this fairly new measure is provided and its relation to the classical VC-dimension is analyzed. Another focus lies on the connection between computable PAC learnability and so-called RER classes. The last part discusses computability in the relaxed setting of nonuniform PAC learning. It is demonstrated that any RER class is nonuniformly PAC learnable with a computable learning function. Some of the results are generalized to continuous domains in the appendix using the framework of computable analysis.

Abstract in another language

Subject (DDC)
510 Mathematics

Keywords

Conference

Review

Research project

Organisational units

Journal issue

Related datasets in KOPS

Cite

ISO 690
KATTERMANN, David, 2025. Computability aspects in statistical learning [Masterarbeit/Diplomarbeit]. Konstanz: Universität Konstanz
BibTeX
@mastersthesis{Kattermann2025Compu-74377,
  title={Computability aspects in statistical learning},
  year={2025},
  address={Konstanz},
  school={Universität Konstanz},
  author={Kattermann, David}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/74377">
    <dc:contributor>Kattermann, David</dc:contributor>
    <dcterms:issued>2025</dcterms:issued>
    <dc:rights>Attribution-NonCommercial-NoDerivatives 4.0 International</dc:rights>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dcterms:title>Computability aspects in statistical learning</dcterms:title>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/74377/4/Kattermann_2-nbxfk51kav277.pdf"/>
    <dcterms:abstract>Probably approximately correct (PAC) learning is the standard framework for supervised binary classification in statistical learning theory. Its purpose is to describe from a statistical perspective how machines can learn to make binary decisions based on finitely many examples. A classical result is the Fundamental Theorem of PAC Learning due to Blumer et al. (1989), which characterizes several seemingly different notions of PAC learnability by the finiteness of a combinatorial measure called the VC-dimension.

In this thesis, PAC learning is studied from the viewpoint of computability theory, the standard framework for formalizing algorithms in theoretical computer science. Interest in computability-theoretic aspects of PAC learning has developed in recent years, initially to exclude undecidability results relying on uncomputable learning functions. Unfortunately, it turned out that the classical Fundamental Theorem of PAC Learning fails when learning functions are required to be computable. This causes the landscape of computable PAC learning to be very diverse, with numerous non-equivalent intermediate notions. This thesis provides a comprehensive overview of the field by establishing necessary and sufficient conditions for various notions and constructing counterexamples separating them.

A main result is a computable version of the Fundamental Theorem of PAC Learning due to Delle Rose et al. (2023) that is based on a computable analog of the VC-dimension called the effective VC-dimension. An extensive discussion of this fairly new measure is provided and its relation to the classical VC-dimension is analyzed. Another focus lies on the connection between computable PAC learnability and so-called RER classes. The last part discusses computability in the relaxed setting of nonuniform PAC learning. It is demonstrated that any RER class is nonuniformly PAC learnable with a computable learning function. Some of the results are generalized to continuous domains in the appendix using the framework of computable analysis.</dcterms:abstract>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-08-25T10:22:58Z</dcterms:available>
    <dc:creator>Kattermann, David</dc:creator>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/74377/4/Kattermann_2-nbxfk51kav277.pdf"/>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/74377"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-08-25T10:22:58Z</dc:date>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by-nc-nd/4.0/"/>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/39"/>
    <dc:language>eng</dc:language>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/39"/>
  </rdf:Description>
</rdf:RDF>

Internal note


Contact
URL of the original publication

URL check date

Examination date of the dissertation

University thesis note
Konstanz, Universität Konstanz, Masterarbeit/Diplomarbeit, 2025
Type of funding

Comment on the publication

Alliance license
Corresponding authors of the Universität Konstanz present
International co-authors
University bibliography
Yes
Peer-reviewed