On self-organizing reservoirs and their hierarchies

  • Recent advances in reservoir computing have demonstrated that fixed random recurrent networks in which only the readouts are trained often outperform fully trained recurrent neural networks. While full supervised training of such networks is problematic, intuitively there should also be something better than a purely random network. In this contribution we investigate an approach in between the two: we use reservoirs derived from recursive self-organizing maps, which are trained in an unsupervised way and then evaluated by training supervised readouts. This approach also enables us to greedily train unsupervised hierarchies of such dynamic reservoirs. Using a synthetic handwriting-like temporal pattern recognition dataset, we rigorously demonstrate the advantage of self-organizing reservoirs over traditional random ones, and of hierarchies of such reservoirs over single ones.
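The approach summarized above can be illustrated with a minimal sketch: a recursive-SOM-style reservoir whose unit activations depend on both the current input and the previous activation vector, adapted unsupervised by pulling the winning unit's weights toward the current input and context, with a supervised linear (ridge-regression) readout trained on the collected states afterwards. The update rules and hyperparameters below are simplified illustrative assumptions, not the report's exact formulation:

```python
import numpy as np

class RecSOMReservoir:
    """Self-organizing reservoir sketch: a recursive SOM whose unit
    activations serve as the reservoir state. Hyperparameters (alpha,
    beta, lr) are illustrative, not taken from the report."""

    def __init__(self, n_units, n_in, alpha=1.0, beta=0.5, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.3, size=(n_units, n_in))     # input weights
        self.C = rng.normal(scale=0.3, size=(n_units, n_units))  # context weights
        self.alpha, self.beta, self.lr = alpha, beta, lr
        self.y = np.zeros(n_units)                               # previous activation

    def step(self, x, train=False):
        # Distance of each unit to the current input and the previous state.
        d = (self.alpha * ((self.W - x) ** 2).sum(axis=1)
             + self.beta * ((self.C - self.y) ** 2).sum(axis=1))
        y_new = np.exp(-d)
        if train:
            # Unsupervised update: move the winning unit's input and context
            # weights toward the current input x and previous activation y(t-1).
            k = int(np.argmin(d))
            self.W[k] += self.lr * (x - self.W[k])
            self.C[k] += self.lr * (self.y - self.C[k])
        self.y = y_new
        return y_new

def run_reservoir(res, X, train=False):
    res.y = np.zeros_like(res.y)   # reset state at sequence start
    return np.stack([res.step(x, train=train) for x in X])

def train_readout(states, targets, ridge=1e-6):
    """Supervised readout: ridge regression from reservoir states to targets."""
    S = np.hstack([states, np.ones((len(states), 1))])  # append bias column
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ targets)

# Tiny demo: fit a phase-delayed sine from the reservoir states.
t = np.linspace(0, 8 * np.pi, 400)
X = np.sin(t)[:, None]
Y = np.sin(t - 0.5)[:, None]           # target requires short-term memory
res = RecSOMReservoir(n_units=50, n_in=1)
for _ in range(3):                     # a few unsupervised training passes
    run_reservoir(res, X, train=True)
states = run_reservoir(res, X)         # collect states with weights frozen
W_out = train_readout(states, Y)
pred = np.hstack([states, np.ones((len(states), 1))]) @ W_out
mse = float(np.mean((pred - Y) ** 2))
```

A hierarchy in the spirit of the report would be built greedily: train one such reservoir unsupervised, freeze it, feed its state sequence as input to the next reservoir, and repeat, training supervised readouts only at the end.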

Metadata
Publishing Institution: IRC-Library, Information Resource Center der Jacobs University Bremen
Author: Mantas Lukosevicius
Persistent Identifier (URN): urn:nbn:de:gbv:579-opus-1006843
Series (No.): Jacobs University Technical Reports (25)
Document Type: Technical Report
Language: English
Date of First Publication: 2010/10/01
School: SES School of Engineering and Science
Library of Congress Classification: Q Science / QA Mathematics (incl. computer science) / QA71-90 Instruments and machines / QA75.5-76.95 Electronic computers. Computer science / QA76.87 Neural computers. Neural networks