
Internal noise in hardware deep and recurrent neural networks helps with learning

Source: arXiv
Abstract

Recently, the field of hardware neural networks has been actively developing: neurons and their connections are not simulated on a computer but are implemented at the physical level, turning the neural network into a tangible device. In this paper, we investigate how internal noise during training affects the final performance of recurrent and deep neural networks, using feedforward networks (FNNs) and echo state networks (ESNs) as examples. The types of noise examined originate from a real optical implementation of a neural network, but were subsequently generalized to broaden the applicability of our findings. We consider additive and multiplicative noise, which differ in how noise influences each individual neuron, and correlated and uncorrelated noise, which differ in how noise affects groups of neurons (such as the hidden layer of an FNN or the reservoir of an ESN). We demonstrate that, in most cases, both deep and echo state networks benefit from internal noise during training, as it enhances their resilience to noise: testing performance at the same noise intensities is significantly higher for networks trained with noise than for those trained without it. Notably, only multiplicative correlated noise during training has almost no impact on either deep or recurrent networks.
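The four noise types in the abstract can be illustrated with a minimal sketch. This is not the authors' code; the function name, the Gaussian noise model, and the noise intensity `sigma` are illustrative assumptions. Additive noise is summed with each activation, multiplicative noise scales it, and "correlated" means a single noise realization is shared by all neurons in the layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_noise(h, sigma=0.1, kind="additive", correlated=False, rng=rng):
    """Inject internal noise into a layer's activations h (1-D array).

    kind="additive":        h + xi
    kind="multiplicative":  h * (1 + xi)
    correlated=True:  one Gaussian sample xi shared by all neurons
    correlated=False: an independent sample per neuron
    """
    if correlated:
        xi = rng.normal(0.0, sigma)              # scalar, broadcast to all neurons
    else:
        xi = rng.normal(0.0, sigma, size=h.shape)  # per-neuron samples
    return h + xi if kind == "additive" else h * (1.0 + xi)

# Example: noisy hidden-layer activations of a small FNN/reservoir
h = np.tanh(rng.normal(size=5))
noisy = apply_noise(h, kind="multiplicative", correlated=True)
```

In a training loop, such a function would be applied to the hidden-layer (or reservoir) state at every forward pass, so the network learns weights that are robust to the perturbation.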

Nadezhda Semenova, Ivan Kolesnikov

Subjects: Optoelectronic Technology; Computing Technology, Computer Technology

Nadezhda Semenova, Ivan Kolesnikov. Internal noise in hardware deep and recurrent neural networks helps with learning [EB/OL]. (2025-04-18) [2025-04-29]. https://arxiv.org/abs/2504.13778.