by Fabian Schubert (Institute for Theoretical Physics, Goethe University, Frankfurt am Main, Germany).

Criticality is considered an important property for recurrent neural networks. Close to a critical phase transition, RNNs show improved performance in sequential information processing. While the theoretical framework of reservoir computing provides a conceptual basis for understanding recurrent neural computation, it requires manual adjustment of global network parameters so that the network can operate in a state close to criticality. In the particular case of echo-state networks, the important quantity is the spectral radius of the recurrent synaptic weight matrix. From the standpoint of biological plausibility, however, a direct calculation of the spectral radius is not possible. We show that there nevertheless exists a local and biologically plausible synaptic scaling mechanism, termed flow control, that allows the spectral radius of the recurrent weights to be controlled while the network is operating under the influence of external inputs. We demonstrate the effectiveness of the new adaptation rule by applying it to echo-state networks and testing their task performance on a time-delayed XOR operation over random binary input sequences. Our adaptation mechanism preserves stable network performance over a wide range of input strengths. This property makes our mechanism more flexible to changes in the external driving than conventional homeostatic scaling mechanisms, which use a fixed internal set point of neural activity.
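For context, the conventional, non-local procedure that the abstract contrasts with can be sketched as follows: the spectral radius of a random reservoir matrix is computed directly from its eigenvalues and the matrix is rescaled to a target value. This is only an illustrative sketch (the network size, weight statistics, and target value are assumptions, not parameters from the talk), not the flow-control rule itself, which avoids exactly this global computation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200  # illustrative reservoir size

# Random recurrent weight matrix with Gaussian entries,
# scaled so the spectral radius is O(1) for large N.
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

# Direct (global, biologically implausible) computation of the
# spectral radius: the largest eigenvalue magnitude.
rho = np.max(np.abs(np.linalg.eigvals(W)))

# Rescale the whole matrix so the spectral radius hits the target.
# A target of 1.0 places the reservoir at the edge of criticality.
target = 1.0
W *= target / rho
```

Because rescaling the matrix by a constant rescales every eigenvalue by the same constant, the spectral radius of the result equals the target exactly (up to floating-point error). The local flow-control rule described in the talk achieves a comparable effect without any neuron having access to the full eigenvalue spectrum.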

Spotlight talk presented on October 8th, 2020, at the Brain Criticality Virtual Conference 2020 (organizers: D. Plenz, D. Chialvo, L. de Arcangelis & D. Battaglia).