Publication: Stochastic Gradient Descent and its Application for Parametrized Boundary Value Problems under Uncertainties
Abstract
In this thesis we give a theoretical and practical introduction to stochastic gradient descent (SGD) methods. In the theoretical part, we prove two fundamental convergence results that hold under certain assumptions, such as a strongly convex objective function. The first result covers the convergence behaviour of SGD with a fixed step size sequence; the second extends it to SGD with a diminishing step size sequence. For both cases, we provide an upper bound on the expected optimality gap. At the expense of a concrete convergence rate, we then generalize both results to non-convex objective functions. The practical part of this thesis applies SGD as a stable optimizer for parametrized boundary value problems under uncertainties. First, we discretize an ordinary differential equation (ODE) Dirichlet problem using finite differences (FD) and improve the results by using preconditioning techniques and a weighted norm. Second, we generalize the results to an elliptic partial differential equation (PDE) Dirichlet problem and compute a weak solution using a finite element (FE) discretization. For both problems, the SGD algorithm delivers stable results and converges in expectation.
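The abstract contrasts SGD under a fixed and under a diminishing step size sequence. As a minimal illustration (not the thesis code; the grad_sample interface and the noisy toy objective below are assumptions made for this sketch), both variants can be written in a few lines of Python:

    import numpy as np

    def sgd(grad_sample, x0, n_iter=5000, alpha=0.1, diminishing=False, seed=0):
        # grad_sample(x, rng) must return an unbiased stochastic estimate of
        # the gradient at x (hypothetical interface, assumed for this sketch).
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        for k in range(1, n_iter + 1):
            # fixed step size alpha, or the classic diminishing O(1/k) schedule
            step = alpha / k if diminishing else alpha
            x = x - step * grad_sample(x, rng)
        return x

    # Toy strongly convex objective f(x) = 0.5 * ||x||^2 with exact gradient x,
    # perturbed by zero-mean noise to mimic a stochastic gradient oracle.
    noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)

    x_fixed = sgd(noisy_grad, x0=np.ones(3))                               # settles in a noise ball around 0
    x_dimin = sgd(noisy_grad, x0=np.ones(3), alpha=1.0, diminishing=True)  # converges to 0 in expectation

The two runs mirror the two convergence results described above: a fixed step size drives the expected optimality gap down to a noise-dependent level, while a sufficiently large diminishing step size drives it to zero in expectation.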
Cite
ISO 690
WOLF, Florian, 2021. Stochastic Gradient Descent and its Application for Parametrized Boundary Value Problems under Uncertainties [Bachelor thesis]. Konstanz: Universität Konstanz

BibTeX
@mastersthesis{Wolf2021Stoch-54423,
  year    = {2021},
  title   = {Stochastic Gradient Descent and its Application for Parametrized Boundary Value Problems under Uncertainties},
  address = {Konstanz},
  school  = {Universität Konstanz},
  type    = {Bachelor thesis},
  author  = {Wolf, Florian}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/54423">
    <dc:contributor>Wolf, Florian</dc:contributor>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/54423/3/Wolf_2-1as0cw2bsjlzs7.pdf"/>
    <dc:language>eng</dc:language>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-07-28T05:57:47Z</dc:date>
    <dcterms:title>Stochastic Gradient Descent and its Application for Parametrized Boundary Value Problems under Uncertainties</dcterms:title>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/54423"/>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-07-28T05:57:47Z</dcterms:available>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:rights>Attribution-NonCommercial-ShareAlike 4.0 International</dc:rights>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/54423/3/Wolf_2-1as0cw2bsjlzs7.pdf"/>
    <dcterms:issued>2021</dcterms:issued>
    <dc:creator>Wolf, Florian</dc:creator>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/39"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/39"/>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by-nc-sa/4.0/"/>
    <dcterms:abstract xml:lang="eng">In this thesis we want to give a theoretical and practical introduction to stochastic gradient descent (SGD) methods. In the theoretical part, we prove two fundamental convergence results that hold under certain assumptions, like a strongly convex objective function. The first result covers the convergence behaviour of SGD running with a fixed step size sequence and is expanded to the second result, which deals with SGD running with a diminishing step size sequence. For both cases, we provide an upper bound for the expected optimality gap. At the expense of a concrete convergence rate, we then generalize both results to non-convex objective functions. The practical part of this thesis deals with the application of SGD as a convincing and stable optimizer for parametrized boundary value problems under uncertainties. Firstly, we discretize an ordinary differential equation (ODE) Dirichlet problem using finite differences (FD) and improve the results by using preconditioning techniques and a weighted norm. Secondly, we generalize the results to an elliptic partial differential equation (PDE) Dirichlet problem and aim for a weak solution using a finite element (FE) discretization. For both problems, the SGD algorithm convinces with stable results and provides convergence in expectation.</dcterms:abstract>
  </rdf:Description>
</rdf:RDF>