Dynamical stability for dense patterns in discrete attractor neural networks
Neural networks storing multiple discrete attractors are canonical models of biological memory. Previously, the dynamical stability of such networks could only be guaranteed under highly restrictive conditions. Here, we derive a theory of the local stability of discrete fixed points in a broad class of networks with graded neural activities and in the presence of noise. By directly analyzing the bulk and outliers of the Jacobian spectrum, we show that all fixed points are stable below a critical load that is distinct from the classical "critical capacity" and depends on the statistics of neural activities in the fixed points as well as the single-neuron activation function. Our analysis highlights the computational benefits of threshold-linear activation and sparse-like patterns.
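The stability analysis described in the abstract can be illustrated numerically. The sketch below is not the paper's derivation: the network size, threshold value, pseudoinverse-style storage rule, and pattern statistics are all illustrative assumptions. It stores graded, sparse-like patterns as exact fixed points of a threshold-linear rate network and then inspects the bulk and outliers of the Jacobian spectrum at one fixed point.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, theta = 200, 20, 0.5  # neurons, stored patterns, threshold (assumed values)

# Threshold-linear activation with a nonzero threshold (a plain ReLU would
# make fixed points scale-invariant and force a marginal zero mode).
phi = lambda u: np.maximum(u - theta, 0.0)

# Hypothetical graded, sparse-like patterns: roughly half the units silent.
X = np.maximum(rng.standard_normal((N, P)), 0.0)

# Target pre-activations that make each pattern an exact fixed point of
# x = phi(W x): active units need (W x)_i = x_i + theta, silent units
# need (W x)_i <= theta (here, 0).
U = np.where(X > 0, X + theta, 0.0)
W = U @ np.linalg.pinv(X)  # pseudoinverse-style storage rule (illustrative)

x = X[:, 0]
assert np.allclose(phi(W @ x), x)  # the pattern is a fixed point

# Local stability of dx/dt = -x + phi(W x): Jacobian J = -I + D W,
# where D = diag(phi'(W x)) is the mask of active units.
D = np.diag((W @ x > theta).astype(float))
J = -np.eye(N) + D @ W

# Bulk and outliers of the spectrum: since rank(W) <= P, at least N - P
# eigenvalues sit exactly at -1 (the bulk); the remaining outliers
# determine whether the fixed point is linearly stable (max Re(eig) < 0).
eigs = np.linalg.eigvals(J)
print("max Re(eig):", eigs.real.max())
print("bulk size (eigs at -1):", int(np.sum(np.abs(eigs + 1) < 1e-6)))
```

Note that with a zero threshold the Jacobian would always have a zero eigenvalue along the pattern direction, which is one way to see why the threshold matters in this construction.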
Uri Cohen, Máté Lengyel
Subjects: biophysics and computational techniques; computer technology
Uri Cohen, Máté Lengyel. Dynamical stability for dense patterns in discrete attractor neural networks [EB/OL]. (2025-07-14) [2025-08-02]. https://arxiv.org/abs/2507.10383.