Nonlinear gradient denoising : Finding accurate extrema from inaccurate functional derivatives
Abstract
A method for nonlinear optimization with machine learning (ML) models, called nonlinear gradient denoising (NLGD), is developed and applied with ML approximations to the kinetic energy density functional in orbital‐free density functional theory. Because the gradients of ML models are systematically inaccurate, in particular when the data is very high‐dimensional, the optimization must be constrained to the data manifold. We use nonlinear kernel principal component analysis (PCA) to locally reconstruct the manifold, enabling a projected gradient descent along it. A thorough analysis of the method is given via a simple model, designed to clarify the concepts presented. Additionally, NLGD is compared with the local PCA method used in previous work. Our method is shown to be superior in cases where the data manifold is highly nonlinear and high‐dimensional. Further applications of the method in both density functional theory and ML are discussed.
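The abstract describes gradient descent constrained to a data manifold that is locally reconstructed with kernel PCA. The following is a minimal sketch of that general idea, assuming scikit-learn's KernelPCA for the reconstruction and a hypothetical model_gradient() callable standing in for the noisy ML functional derivative; it simply re-projects each raw gradient step back onto the reconstructed manifold and is not the authors' NLGD implementation.

    # Sketch of manifold-constrained gradient descent with a kernel PCA
    # reconstruction of the data manifold. model_gradient() is a hypothetical
    # stand-in for the (inaccurate) gradient of the ML energy model.
    import numpy as np
    from sklearn.decomposition import KernelPCA

    def manifold_projected_descent(x0, X_train, model_gradient,
                                   n_components=5, gamma=1.0,
                                   step=1e-2, n_steps=200):
        # Fit kernel PCA on training samples; inverse_transform acts as a
        # learned pre-image map, i.e. a local reconstruction of the manifold.
        kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma,
                         fit_inverse_transform=True)
        kpca.fit(X_train)

        x = np.asarray(x0, dtype=float).copy()
        for _ in range(n_steps):
            g = model_gradient(x)             # noisy gradient from the ML model
            x = x - step * g                  # raw step may leave the manifold
            z = kpca.transform(x[None, :])    # coordinates in kernel PCA space
            x = kpca.inverse_transform(z)[0]  # project back onto the manifold
        return x

In this simplified variant the projection happens after each full step; the paper instead denoises the gradient itself by projecting it along the locally reconstructed manifold before stepping.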
Cite
ISO 690
SNYDER, John C., Matthias RUPP, Klaus-Robert MÜLLER, Kieron BURKE, 2015. Nonlinear gradient denoising : Finding accurate extrema from inaccurate functional derivatives. In: International Journal of Quantum Chemistry. Wiley-Blackwell. 2015, 115(16), pp. 1102-1114. ISSN 0020-7608. eISSN 1097-461X. Available under: doi: 10.1002/qua.24937
BibTeX
@article{Snyder2015Nonli-52135,
  author  = {Snyder, John C. and Rupp, Matthias and Müller, Klaus-Robert and Burke, Kieron},
  title   = {Nonlinear gradient denoising : Finding accurate extrema from inaccurate functional derivatives},
  journal = {International Journal of Quantum Chemistry},
  year    = {2015},
  volume  = {115},
  number  = {16},
  pages   = {1102--1114},
  issn    = {0020-7608},
  doi     = {10.1002/qua.24937}
}