Echo State Networks with Trained Feedbacks
- Echo State Networks (ESNs) are an approach to recurrent neural network (RNN) training based on generating a large random network (the reservoir) of sparsely interconnected neurons and learning only a single layer of output weights mapping the reservoir states to the target function. Despite many advantages of ESNs over gradient-based RNN training techniques, they lack the power to learn some complex functions. Recent findings in dynamical systems theory state that fixed neural circuits can acquire universal computational capabilities if suitable feedbacks (or intermediate units) can be trained; unfortunately, the theory gives no hint of how this training can be done. In this report we explore possible directions in which these theoretical findings could be applied to increase the computational power of ESNs. More specifically, we discuss possible options for defining training targets for the feedbacks, present and discuss empirical results (positive as well as negative) testing the ideas in practice, and analyze some problems that point to intrinsic limitations of ESNs. Another contribution of this report is a discussion of many practical issues in training ESNs, in particular those with feedback connections. We also propose a modification of ESNs called Layered ESNs. This technical report is based on the author's Master's thesis, "Improving Echo State Networks by Training Intermediate Units".
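The core ESN idea described above can be illustrated by a minimal sketch (this is an assumption about a typical setup, not the report's exact configuration): a fixed sparse random reservoir is run over an input sequence, and only the linear readout is fit, here with ridge regression on a toy prediction task.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

# Fixed sparse random reservoir, rescaled to spectral radius < 1
# (a common sufficient condition aimed at the echo state property).
W = rng.uniform(-1, 1, (n_res, n_res))
W[rng.random((n_res, n_res)) > 0.1] = 0.0          # ~10% connectivity
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius 0.9
W_in = rng.uniform(-1, 1, (n_res, n_in))           # fixed input weights

def run_reservoir(u):
    """Drive the reservoir with input sequence u of shape (T, n_in)
    and collect the state at every time step."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task (hypothetical): predict a phase-shifted sine from a sine.
t = np.linspace(0, 20, 1000)
u, y = np.sin(t)[:, None], np.sin(t + 0.2)[:, None]
X = run_reservoir(u)[100:]                         # discard washout
Y = y[100:]

# Only the readout is trained: ridge regression on reservoir states.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ Y).T
pred = X @ W_out.T
mse = float(np.mean((pred - Y) ** 2))
```

Everything except `W_out` stays random and untrained, which is exactly why adding trainable feedback connections, as discussed in this report, is a nontrivial extension.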