Extending Dynamic Memory of Spiking Neuron Networks
Date
2024-05
Journal Title
Chaos, Solitons and Fractals
Volume
182
Article Number
114850
Publisher
Elsevier Ltd
Abstract
Explaining the mechanisms of dynamic memory, which allows for temporary storage of information on the timescale of seconds despite neuronal firing on the millisecond scale, is an important challenge not only for neuroscience but also for computation in neuromorphic artificial networks. We demonstrate a potential origin of such longer timescales by comparing the spontaneous activity of excitatory neural networks with sparse random, regular, and small-world connection topologies. We derive a mean-field model based on a self-consistent approach and a white-noise approximation to analyze the transient and long-term collective network dynamics. While the long-term dynamics is typically irregular and weakly correlated independent of the network architecture, particularly long timescales emerge in the transient activity, which consists of switching fronts in regular and small-world networks with a small rewiring probability. Analyzing the dynamic memory of the networks on a simple computational delay task within the framework of reservoir computing, we show that optimal performance is reached on average for a regular connection topology if the input is appropriately structured, but certain instances of small-world networks may deviate strongly from the configuration average and outperform all other considered network architectures.
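
The abstract compares sparse regular, small-world, and random connection topologies, with the small-world case parameterized by a rewiring probability. The following minimal sketch illustrates these three topology classes via the standard Watts-Strogatz construction; the network size, degree, and rewiring probabilities are placeholder values chosen for illustration and are not taken from the article.

```python
# Illustrative sketch: regular, small-world, and random sparse topologies
# generated with the Watts-Strogatz construction. N, k, and the rewiring
# probabilities are placeholders, not the article's parameters.
import networkx as nx

N, k = 1000, 10  # placeholder: number of nodes, neighbors per node

topologies = {
    "regular":     nx.connected_watts_strogatz_graph(N, k, p=0.0),   # ring lattice, no rewiring
    "small-world": nx.connected_watts_strogatz_graph(N, k, p=0.05),  # small rewiring probability
    "random":      nx.connected_watts_strogatz_graph(N, k, p=1.0),   # fully rewired, ~sparse random
}

for name, g in topologies.items():
    # High clustering with short paths is the small-world signature.
    print(f"{name:12s} clustering={nx.average_clustering(g):.3f} "
          f"path length={nx.average_shortest_path_length(g):.2f}")
```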
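
The dynamic memory is probed with a simple delay task in the reservoir-computing framework. As a rough illustration of such a task (not the article's spiking networks or training procedure), the sketch below trains a linear readout by ridge regression to reproduce a random input delayed by a fixed number of steps from the states of a generic tanh-rate reservoir; all sizes and parameters are assumed placeholders.

```python
# Illustrative delay (memory) task in a reservoir-computing setting.
# The tanh-rate reservoir is a stand-in for the article's spiking networks;
# every parameter below is a placeholder.
import numpy as np

rng = np.random.default_rng(0)
N, T, tau = 200, 5000, 10                 # reservoir size, sequence length, delay

W_in = rng.normal(0.0, 0.5, size=N)       # input weights
W = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

u = rng.uniform(-1, 1, size=T)            # random input sequence
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])      # reservoir state update
    states[t] = x

# Target: the input delayed by tau steps; train the readout by ridge regression.
y = np.roll(u, tau)
X, Y = states[tau:], y[tau:]
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)
pred = X @ w_out

# Memory quality at this delay: squared correlation between prediction and target.
mc_tau = np.corrcoef(pred, Y)[0, 1] ** 2
print(f"memory capacity at delay {tau}: {mc_tau:.3f}")
```

Summing this squared correlation over increasing delays gives the usual memory-capacity measure; comparing it across the topologies above is one way to reproduce the kind of topology-dependent delay-task comparison described in the abstract.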