Showing the Equivalence of Two Training Algorithms - Part II
Abstract
In previous work, Graph Transformations have been shown to offer a powerful way to formally specify Neural Networks and their corresponding training algorithms, and it has been shown how this formalism can be used to prove properties of those algorithms. In this paper, Graph Transformations are used to show the equivalence of two training algorithms for Recurrent Neural Networks: Back Propagation Through Time and a variant of Real Time Backpropagation. In addition to this proof, a whole class of related training algorithms emerges from the formalism. Part I of this paper presents the formalization of the two algorithms; Part II then shows how Graph Transformations can be used to prove their equivalence.
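The paper's argument is carried out entirely with Graph Transformations, not code. As an illustration only of the equivalence being proved, the sketch below computes the gradient of a one-neuron recurrent network h_t = tanh(w*h_{t-1} + u*x_t) with a per-step squared loss in two ways: a backward sweep over the unrolled network (BPTT) and a forward, real-time accumulation of dh_t/dw in the style of Real Time Backpropagation. The network, loss, and variable names (w, u, x, y) are assumptions made for this example, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    T = 6
    x = rng.normal(size=T)          # inputs x_1..x_T
    y = rng.normal(size=T)          # targets y_1..y_T
    w, u = 0.5, 0.8                 # recurrent and input weights

    # Forward pass (shared by both algorithms); h[0] is the initial state.
    h = np.zeros(T + 1)
    for t in range(1, T + 1):
        h[t] = np.tanh(w * h[t - 1] + u * x[t - 1])
    e = h[1:] - y                   # direct part of dL/dh_t for L = 0.5*sum((h_t - y_t)^2)

    # BPTT: one backward sweep over the unrolled network.
    grad_bptt, g_next = 0.0, 0.0    # g_next holds dL/da_{t+1}, a_t = w*h_{t-1} + u*x_t
    for t in range(T, 0, -1):
        g = (e[t - 1] + w * g_next) * (1.0 - h[t] ** 2)   # dL/da_t
        grad_bptt += g * h[t - 1]
        g_next = g

    # Real-time variant: carry p_t = dh_t/dw forward in time.
    grad_rt, p = 0.0, 0.0
    for t in range(1, T + 1):
        p = (1.0 - h[t] ** 2) * (h[t - 1] + w * p)        # recurrence for dh_t/dw
        grad_rt += e[t - 1] * p                           # accumulate as we go

    print(grad_bptt, grad_rt)       # identical up to floating-point round-off

Both loops compute the same total derivative dL/dw, differing only in whether the chain rule is accumulated backward or forward through time; this numerical agreement is the elementary counterpart of the equivalence the paper establishes formally.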
Cite
ISO 690
FISCHER, Ingrid, Manuel KOCH, Michael R. BERTHOLD, 1998. Showing the Equivalence of Two Training Algorithms - Part II. ICNN '98 - International Conference on Neural Networks. Anchorage, AK, USA. In: 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227). IEEE, 1998, pp. 452-456. ISBN 0-7803-4859-1. Available under: doi: 10.1109/IJCNN.1998.682309
BibTeX
@inproceedings{Fischer1998Showi-24289,
  year      = {1998},
  doi       = {10.1109/IJCNN.1998.682309},
  title     = {Showing the Equivalence of Two Training Algorithms - Part II},
  isbn      = {0-7803-4859-1},
  publisher = {IEEE},
  booktitle = {1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)},
  pages     = {452--456},
  author    = {Fischer, Ingrid and Koch, Manuel and Berthold, Michael R.}
}
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:bibo="http://purl.org/ontology/bibo/"
         xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:void="http://rdfs.org/ns/void#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/24289">
    <dc:rights>terms-of-use</dc:rights>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:bibliographicCitation>The 1998 IEEE International Joint Conference on Neural Networks Proceedings : IEEE World Congress on Computational Intelligence : May 4-May 9, 1998, Anchorage, Alaska, USA / [general chair: Patrick K. Simpson]. - Piscataway : IEEE Service Center, 1998. - S. 452-456. - ISBN 0-7803-4859-1</dcterms:bibliographicCitation>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2013-08-22T08:03:12Z</dcterms:available>
    <dcterms:issued>1998</dcterms:issued>
    <dc:contributor>Fischer, Ingrid</dc:contributor>
    <dc:language>eng</dc:language>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2013-08-22T08:03:12Z</dc:date>
    <dc:contributor>Koch, Manuel</dc:contributor>
    <dcterms:abstract xml:lang="eng">In previous work Graph Transformations have been shown to offer a powerful way to formally specify Neural Networks and their corresponding training algorithms. It has also been shown how to use this formalism to prove properties of the used algorithms. In this paper Graph Transformations are used to show the equivalence of two training algorithms for Recurrent Neural Networks, Back Propagation Through Time and a variant of Real Time Backpropagation. In addition to this proof a whole class of related training algorithm emerges from the used formalism. In part I of this paper the formalization of the two algorithms is shown; part II then shows how Graph Transformations can be used to prove the equivalence of both algorithms.</dcterms:abstract>
    <dc:creator>Fischer, Ingrid</dc:creator>
    <dc:creator>Koch, Manuel</dc:creator>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <bibo:uri rdf:resource="http://kops.uni-konstanz.de/handle/123456789/24289"/>
    <dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
    <dc:contributor>Berthold, Michael R.</dc:contributor>
    <dcterms:title>Showing the Equivalence of Two Training Algorithms - Part II</dcterms:title>
    <dc:creator>Berthold, Michael R.</dc:creator>
  </rdf:Description>
</rdf:RDF>