Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning

Authors

  • Rainer Spiegel, Ludwig Maximilians Universität

DOI:

https://doi.org/10.3991/ijet.v2i3.100

Abstract


Recurrent neural networks are frequently used to simulate sequence-learning applications such as language processing and sensory-motor learning. For this purpose, they often rely on a truncated gradient-descent (error-correcting) learning algorithm. In order to converge to a solution that is congruent with a target set of sequences, many iterations of sequence presentation and weight adjustment are typically needed. Moreover, there is no guarantee of finding the global minimum of error in the multidimensional error landscape that results from the discrepancy between target values and the network's predictions. This paper presents a new approach that infers the global error minimum right from the start and then uses this information to reverse-engineer the weights. As a consequence, learning is sped up tremendously, while computationally expensive iterative training trials can be skipped. Technology applications in established and emerging industries are discussed.
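The abstract does not specify the author's exact procedure, so the following is only a minimal illustrative sketch of the contrast it describes: iterative truncated gradient descent versus a Bayesian (Gaussian linear model) solution whose posterior mean yields the error-minimising readout weights in a single step. All names (states, targets, ridge) and the linear-readout assumption are hypothetical, not taken from the paper.

import numpy as np

# Hypothetical setup: hidden-state trajectory of a recurrent network and the
# target sequence it should reproduce (random data for illustration only).
rng = np.random.default_rng(0)
states = rng.standard_normal((200, 10))   # 200 time steps, 10 hidden units
targets = rng.standard_normal((200, 3))   # 3 target dimensions per time step

# (a) Truncated gradient descent: many presentations and weight adjustments.
W_gd = np.zeros((10, 3))
for _ in range(5000):
    error = states @ W_gd - targets
    W_gd -= 0.01 * states.T @ error / len(states)

# (b) Bayesian alternative: the posterior mean of a Gaussian linear model
# (equivalently, ridge regression) locates the global error minimum directly,
# effectively reverse-engineering the weights without iterative training.
ridge = 1e-3
W_bayes = np.linalg.solve(states.T @ states + ridge * np.eye(10),
                          states.T @ targets)

print(np.abs(W_gd - W_bayes).max())       # the two solutions nearly coincide

In this toy setting both routes reach essentially the same weights, but the closed-form Bayesian solution skips the thousands of iterative updates, which is the speed-up the abstract emphasises.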

Author Biography

Rainer Spiegel, Ludwig Maximilians Universität

PhD, University of Cambridge, England, 2002
Fellow, Wolfson College, University of Cambridge, 2002 to 2006
2004 to 2007: Lecturer in Medical Psychology, Medical School of Ludwig-Maximilians-University, Munich
2004 to 2007: Group leader of the Sensory-motor Learning Lab, Ludwig-Maximilians-University, Munich

Published

2007-06-15

How to Cite

Spiegel, R. (2007). Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning. International Journal of Emerging Technologies in Learning (iJET), 2(3). https://doi.org/10.3991/ijet.v2i3.100
