Generating functionals for computational intelligence: The Fisher information as an objective function for self-limiting Hebbian learning rules
Generating functionals may guide the evolution of a dynamical system and constitute a possible route for handling the complexity of neural networks as relevant for computational intelligence. We propose and explore a new objective function which allows one to obtain plasticity rules for the afferent synaptic weights. The adaption rules are Hebbian and self-limiting, and result from the minimization of the Fisher information with respect to the synaptic flux.

We perform a series of simulations examining the behavior of the new learning rules in various circumstances. The vector of synaptic weights aligns with the principal direction of the input activities, whenever one is present. A linear discrimination is performed when there are two or more principal directions; directions with bimodal firing-rate distributions, characterized by a negative excess kurtosis, are preferred.

We find robust performance, and full homeostatic adaption of the synaptic weights results as a by-product of the synaptic-flux minimization. This self-limiting behavior allows for stable online learning for arbitrary durations. The neuron acquires new information when the statistics of the input activities are changed at a certain point in the simulation, showing, however, a distinct resilience against unlearning previously acquired knowledge. Learning is fast when starting from randomly drawn synaptic weights and substantially slower when the synaptic weights are already fully adapted.
Main Authors: | Rodrigo Echeveste, Claudius Gros |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2014-05-01 |
Series: | Frontiers in Robotics and AI |
Subjects: | synaptic plasticity; Hebbian learning; Fisher information; generating functionals; objective functions; homeostatic adaption |
Online Access: | http://journal.frontiersin.org/Journal/10.3389/frobt.2014.00001/full |
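The abstract above states that the proposed plasticity rules are Hebbian and self-limiting, and that the synaptic weight vector aligns with the principal direction of the input activities. The paper's specific Fisher-information-derived rules are not reproduced here; as a minimal sketch of the same qualitative behavior, the classic Oja rule (a standard self-limiting Hebbian rule, named explicitly as a stand-in and not the authors' rule) shows how a decay term proportional to the squared output bounds the weights while the weight vector converges onto the dominant principal direction. All variable names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input stream with one dominant principal direction.
n_steps, n_inputs = 5000, 5
principal = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
X = rng.normal(size=(n_steps, n_inputs))                # isotropic background
X += 3.0 * rng.normal(size=(n_steps, 1)) * principal    # boosted direction

# Oja's rule: dw = eta * y * (x - y * w).  The -y^2 * w decay term makes the
# plain Hebbian update eta * y * x self-limiting (|w| stays near 1), so no
# explicit weight clipping or normalization step is needed during learning.
eta = 0.005
w = rng.normal(size=n_inputs)
w /= np.linalg.norm(w)
for x in X:
    y = w @ x                       # linear neuron output
    w += eta * y * (x - y * w)      # Hebbian term + self-limiting decay

# The weight vector aligns (up to sign) with the principal direction.
alignment = abs(w @ principal) / np.linalg.norm(w)
print(f"alignment with principal direction: {alignment:.3f}")
```

Running the loop longer, or restarting it with the converged weights, leaves the alignment essentially unchanged, which mirrors in miniature the stable online learning for arbitrary durations that the abstract reports for the Fisher-information-based rules.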
id |
doaj-a3c4a324b66e46ad88ffff425026dd35 |
---|---|
record_format |
Article |
doi |
10.3389/frobt.2014.00001 |
affiliation |
Goethe University Frankfurt |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Rodrigo Echeveste; Claudius Gros |
title |
Generating functionals for computational intelligence: The Fisher information as an objective function for self-limiting Hebbian learning rules |
publisher |
Frontiers Media S.A. |
series |
Frontiers in Robotics and AI |
issn |
2296-9144 |
publishDate |
2014-05-01 |
description |
Generating functionals may guide the evolution of a dynamical system and constitute a possible route for handling the complexity of neural networks as relevant for computational intelligence. We propose and explore a new objective function which allows one to obtain plasticity rules for the afferent synaptic weights. The adaption rules are Hebbian and self-limiting, and result from the minimization of the Fisher information with respect to the synaptic flux. We perform a series of simulations examining the behavior of the new learning rules in various circumstances. The vector of synaptic weights aligns with the principal direction of the input activities, whenever one is present. A linear discrimination is performed when there are two or more principal directions; directions with bimodal firing-rate distributions, characterized by a negative excess kurtosis, are preferred. We find robust performance, and full homeostatic adaption of the synaptic weights results as a by-product of the synaptic-flux minimization. This self-limiting behavior allows for stable online learning for arbitrary durations. The neuron acquires new information when the statistics of the input activities are changed at a certain point in the simulation, showing, however, a distinct resilience against unlearning previously acquired knowledge. Learning is fast when starting from randomly drawn synaptic weights and substantially slower when the synaptic weights are already fully adapted. |
topic |
synaptic plasticity; Hebbian learning; Fisher information; generating functionals; objective functions; homeostatic adaption |
url |
http://journal.frontiersin.org/Journal/10.3389/frobt.2014.00001/full |