Constructive role of plasticity rules in reservoir computing


dc.contributor Mirasso Santos, Claudio Rubén
dc.contributor Cornelles Soriano, Miguel
dc.contributor.author Barrios Morales, Guillermo Gabriel
dc.date 2019
dc.date.accessioned 2020-04-15T09:12:18Z
dc.date.available 2020-04-15T09:12:18Z
dc.date.issued 2019-09-20
dc.identifier.uri http://hdl.handle.net/11201/152025
dc.description.abstract [eng] Over the last 15 years, Reservoir Computing (RC) has emerged as an appealing approach in Machine Learning, combining the high computational capabilities of Recurrent Neural Networks with fast and easy training. By mapping the inputs into a high-dimensional space of non-linear neurons, this class of algorithms has shown its utility in a wide range of tasks, from speech recognition to time-series prediction. With their popularity on the rise, new works have pointed to RC as a possible learning paradigm within the actual brain. Likewise, the successful implementation of biologically based plasticity rules in RC artificial networks has boosted the performance of the original models. Within these nature-inspired approaches, most research lines focus on improving the performance achieved by previous works on prediction or classification tasks. In this thesis, however, we address the problem from a different perspective: instead of focusing on the results of the improved models, we analyze the role of plasticity rules in the changes that lead to better performance. To this end, we implement synaptic and non-synaptic plasticity rules in a standard Echo State Network (ESN), a paradigmatic example of an RC model. Testing on temporal series prediction tasks, we show evidence that the improved performance in all plastic models may be linked to a decrease in spatio-temporal correlations in the reservoir, as well as a significant increase in individual neurons' ability to separate similar inputs in their activity space. From the perspective of the reservoir dynamics, optimal performance is suggested to occur at the edge of instability. This hypothesis has been previously suggested in the literature, but we hope to provide new insight on the matter through the study of the different stages of plastic learning.
Finally, we show that it is possible to combine different forms of plasticity (namely, synaptic and non-synaptic rules) to further improve the performance on prediction tasks, obtaining better results than those achieved with single-plasticity models.
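The abstract above describes the standard Echo State Network setup on which the thesis builds: inputs are projected into a large recurrent reservoir of non-linear neurons, and only a linear readout is trained. The following is a minimal sketch of that baseline (without the plasticity rules studied in the thesis); the reservoir size, spectral radius, ridge parameter, and the sine-wave prediction task are illustrative assumptions, not the thesis's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not the thesis's parameters): 1 input channel,
# 100 reservoir neurons, 500 time steps.
N_in, N_res, T = 1, 100, 500

# Random input and recurrent weights; W is rescaled so its spectral
# radius is 0.9 < 1, a common sufficient condition for the echo
# state property.
W_in = rng.uniform(-0.5, 0.5, (N_res, N_in))
W = rng.uniform(-0.5, 0.5, (N_res, N_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Drive the reservoir with a toy input (a sine wave) and collect the
# high-dimensional non-linear states.
u = np.sin(0.1 * np.arange(T)).reshape(T, N_in)
x = np.zeros(N_res)
states = np.zeros((T, N_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Train only the linear readout (ridge regression) to predict the next
# input value -- the defining shortcut of reservoir computing: the
# recurrent weights W stay fixed (plasticity rules would modify them
# or the neurons' transfer functions during a pre-training phase).
washout, ridge = 100, 1e-6
X, y = states[washout:-1], u[washout + 1:, 0]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

In this picture, the synaptic (e.g. Hebbian) and non-synaptic (intrinsic) plasticity rules mentioned in the abstract would adapt `W` or the neurons' activation functions in an unsupervised pre-training stage, before the readout is fitted.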
dc.format application/pdf
dc.language.iso eng ca
dc.publisher Universitat de les Illes Balears
dc.rights all rights reserved
dc.rights info:eu-repo/semantics/openAccess
dc.subject 53 - Física ca
dc.subject.other Reservoir Computing ca
dc.subject.other Plasticity rules ca
dc.subject.other Hebbian learning ca
dc.subject.other Intrinsic plasticity ca
dc.subject.other Temporal series prediction ca
dc.title Constructive role of plasticity rules in reservoir computing ca
dc.type info:eu-repo/semantics/masterThesis ca
dc.type info:eu-repo/semantics/publishedVersion
dc.date.updated 2019-11-29T10:56:54Z

