Error Broadcast and Decorrelation as a Potential Artificial and Natural Learning Mechanism
We introduce Error Broadcast and Decorrelation (EBD), a novel learning framework for neural networks that addresses credit assignment by broadcasting output errors directly to individual layers, circumventing the weight transport problem of backpropagation. EBD is rigorously grounded in the stochastic orthogonality property of Minimum Mean Square Error estimators: the error of an optimal estimator is orthogonal to any function of the input. Guided by this insight, EBD defines layerwise loss functions that directly penalize correlations between layer activations and output errors, thereby establishing a principled foundation for error broadcasting. This theoretically sound mechanism naturally leads to the experimentally observed three-factor learning rule and integrates with biologically plausible frameworks to enhance performance and plausibility. Numerical experiments demonstrate that EBD achieves performance competitive with or better than other error-broadcast methods on benchmark datasets. Our findings establish EBD as an efficient, biologically plausible, and principled alternative for neural network training.
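To make the layerwise objective concrete, the sketch below is a minimal, hypothetical PyTorch fragment (not the authors' released code) illustrating one way to penalize the empirical cross-correlation between a layer's activations and the broadcast output error, in the spirit of the stochastic orthogonality property described in the abstract; the function name and centering step are illustrative assumptions.

```python
import torch

def ebd_layer_loss(activations: torch.Tensor, output_error: torch.Tensor) -> torch.Tensor:
    """Illustrative layerwise decorrelation loss (assumed form, not the paper's exact objective).

    activations:  (batch, layer_dim) hidden-layer activations h_l(x)
    output_error: (batch, out_dim)   broadcast output error e = y_hat - y

    The MMSE orthogonality property implies E[e g(x)^T] = 0 for an optimal estimator;
    this loss drives the empirical cross-correlation matrix toward zero.
    """
    batch = activations.shape[0]
    # Center both signals so the statistic is a cross-correlation rather than a raw moment.
    h = activations - activations.mean(dim=0, keepdim=True)
    e = output_error.detach() - output_error.detach().mean(dim=0, keepdim=True)
    # Empirical cross-correlation matrix of shape (layer_dim, out_dim).
    cross_corr = h.t() @ e / batch
    # Squared Frobenius norm penalizes any remaining correlation.
    return cross_corr.pow(2).sum()
```

In such a setup, each layer would receive the same broadcast error and minimize its own decorrelation loss locally, avoiding backpropagation of gradients through downstream weights.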
Mete Erdogan, Cengiz Pehlevan, Alper T. Erdogan
Subjects: Computing Technology, Computer Technology
Mete Erdogan, Cengiz Pehlevan, Alper T. Erdogan. Error Broadcast and Decorrelation as a Potential Artificial and Natural Learning Mechanism [EB/OL]. (2025-04-15) [2025-05-31]. https://arxiv.org/abs/2504.11558