The influence of Hebbian learning rules on the dynamics, topology and function of random recurrent neural networks

Hugues Berry, Alchemy Team, INRIA, Orsay, France

In biological neural networks, local (Hebbian) learning rules couple neuron dynamics to network topology (the synaptic weights) in a bidirectional fashion. Applying such learning rules to random recurrent neural networks (RRNNs) reduces the chaotic dynamics of these models to simpler attractors. This behavior endows RRNNs with biologically plausible associative memory properties but remains poorly understood. In particular, our recent simulations showed that learning leads to the emergence of a small-world structure in the adjacency network. Here, we analyze RRNNs using both dynamical systems theory and complex network approaches, and trace the effects of Hebbian rules on the three networks of the system: the Jacobian matrix, the synaptic weight network and the adjacency network. While the information provided by classical statistical indicators of complex networks is limited, we show how the spectra of these matrices explain the observed modifications of the network dynamics, topology and function.
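To illustrate the class of models under study, the following is a minimal Python sketch, not the authors' exact equations: a sigmoidal random recurrent network with a covariance-style Hebbian update and a simple row renormalization. The network size, gain, learning rate and number of iterations are all arbitrary illustrative choices. The final line computes the spectral radius of the weight matrix, the kind of spectral quantity the abstract argues explains the simplification of the dynamics.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the authors' model):
# a random recurrent neural network of sigmoidal rate neurons
# with a covariance-style Hebbian learning rule.

rng = np.random.default_rng(0)

N = 100                      # number of neurons (illustrative)
g = 8.0                      # gain of the sigmoid transfer function
alpha = 0.01                 # Hebbian learning rate
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # random synaptic weights
x = rng.random(N)            # initial activities in [0, 1]

def step(x, W):
    """One synchronous update of the rate dynamics."""
    return 1.0 / (1.0 + np.exp(-g * (W @ x)))

# Let the network settle into its spontaneous (typically irregular) regime.
for _ in range(200):
    x = step(x, W)

# Hebbian phase: strengthen weights between co-active neurons,
# then renormalize each row so total input strength stays bounded.
for _ in range(500):
    x = step(x, W)
    dx = x - x.mean()
    W += alpha * np.outer(dx, dx)                               # covariance Hebb
    W /= np.maximum(np.abs(W).sum(axis=1, keepdims=True), 1.0)  # row normalization

# The spectrum of W (or of the Jacobian at a fixed point) tracks how learning
# reshapes the dynamics: a smaller spectral radius favors simpler attractors.
radius = np.max(np.abs(np.linalg.eigvals(W)))
print(f"spectral radius after learning: {radius:.3f}")
```

With the row normalization above, the spectral radius is bounded by the largest absolute row sum of `W`, so it cannot exceed 1 after learning; in the full model, comparing the spectrum before and after learning is what connects the weight matrix to the observed change in dynamics.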