Recurrent Neural Networks for Prediction
Learning Algorithms, Architectures and Stability
AUTHOR: Mandic, Danilo P. and Chambers, Jonathon A.
PUBLISHER: John Wiley & Sons, Incorporated
- Analyses the relationships between RNNs and various nonlinear models and filters, and introduces spatio-temporal architectures together with the concepts of modularity and nesting
- Examines stability and relaxation within RNNs
- Presents on-line learning algorithms for nonlinear adaptive filters and introduces new paradigms which exploit the concepts of a priori and a posteriori errors, data-reusing adaptation, and normalisation
- Studies convergence and stability of on-line learning algorithms based upon optimisation techniques such as contraction mapping and fixed point iteration
- Describes strategies for the exploitation of inherent relationships between parameters in RNNs
- Discusses practical issues such as predictability and nonlinearity detection, and includes several practical applications in areas such as air pollutant modelling and prediction, attractor discovery and chaos, ECG signal processing, and speech processing
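The a priori/a posteriori error and normalisation concepts listed above can be illustrated with a minimal normalised LMS (NLMS) one-step-ahead predictor. This is a sketch under common textbook conventions, not the book's own algorithms; the function name and parameter choices are illustrative:

```python
import numpy as np

def nlms_predict(x, mu=0.5, eps=1e-6, order=4):
    """One-step-ahead prediction of signal x with a normalised LMS
    adaptive filter (illustrative sketch; names/defaults are assumptions)."""
    w = np.zeros(order)                 # adaptive filter weights
    e_pri = np.zeros(len(x))            # a priori errors (before the update)
    e_post = np.zeros(len(x))           # a posteriori errors (after the update)
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]        # regressor of past samples
        y = w @ u                       # predicted value of x[n]
        e_pri[n] = x[n] - y             # a priori prediction error
        # normalised update: step size scaled by regressor energy
        w = w + mu * e_pri[n] * u / (eps + u @ u)
        e_post[n] = x[n] - w @ u        # a posteriori error with updated weights
    return w, e_pri, e_post
```

For a step size 0 < mu < 1, the normalised update shrinks the error at every step, so the a posteriori error magnitude never exceeds the a priori one; the book's data-reusing paradigms iterate updates of this kind on the same sample.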
Recurrent Neural Networks for Prediction offers new insight into the learning algorithms, architectures and stability of recurrent neural networks and, consequently, will have instant appeal. It provides an extensive background for researchers, academics and postgraduates, enabling them to apply such networks in new applications.
PUBLICATION DATE: 9/5/2001