Griffin, David and Stovold, James and O’Keefe, Simon and Stepney, Susan (2026) Evaluating ESNs Against Lagged Input Regression Computation. In: Unconventional Computation and Natural Computation. Lecture Notes in Computer Science. Springer, Cham, pp. 262-276. ISBN 9783032156402
Full text not available from this repository.

Abstract
Echo State Networks (ESNs) are stated in the literature to use a random structure to project an input sequence into a higher-dimensional space in which the input becomes linearly separable. However, the linear mathematics used for this projection is incapable of increasing the dimensionality of the input, and the commonly used tanh() activation function tends not to produce much nonlinearity. Therefore, any increase in dimensionality is due to the echoes of the ESN. We introduce Lagged Input Regression Computation to investigate which types of ESN can be replaced with simpler non-randomised structures. We show that tanh()-based ESNs behave as simple linear memory systems, whereas LeakyReLU provides a more effective non-linearity. We also show that the use of certain orthogonal polynomials in defining nonlinear memory capacity benchmarks gives a misleading impression of nonlinearity, because the relevant high-order polynomials nevertheless contain a linear term.
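The reservoir update the abstract refers to can be sketched as follows. This is a minimal, generic ESN state update showing the two activation functions the paper contrasts (tanh and LeakyReLU); the reservoir size, weight scales, spectral radius, and LeakyReLU slope here are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, T = 50, 200  # reservoir size and sequence length (illustrative)

# Random input and reservoir weight matrices, as in a standard ESN.
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_esn(u, f):
    """Drive the reservoir with scalar input sequence u using activation f."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = f(W_in[:, 0] * u_t + W @ x)  # x(t) = f(W_in u(t) + W x(t-1))
        states[t] = x
    return states

u = rng.standard_normal(T)
tanh_states = run_esn(u, np.tanh)
leaky_relu = lambda z: np.where(z > 0, z, 0.01 * z)  # assumed slope 0.01
relu_states = run_esn(u, leaky_relu)
```

A linear readout is then trained by regression on the collected states; the paper's argument is that with tanh the states remain close to a linear function of lagged inputs, so such an ESN can be replaced by a non-random lagged-input regressor.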