Impact of noise on analog neural networks
Nadezhda Semenova1,2, Laurent Larger2, Daniel Brunner2
1Saratov State University, Russia
2Département d’Optique P. M. Duffieux, Institut FEMTO-ST, Université Bourgogne-Franche-Comté CNRS UMR 6174, Besançon, France
Abstract
Maximal computing performance can be achieved if neural networks are fully implemented in hardware. Besides the potentially large benefits, such parallel and analogue hardware platforms face new, fundamental challenges. An important concern is that such systems might ultimately succumb to the detrimental impact of noise. We study noise propagation [1,2] through deep neural networks with various neuron nonlinearities, trained via back-propagation for two different tasks. We consider correlated and uncorrelated, multiplicative and additive noise, and use noise amplitudes extracted from a physical experiment [3]. Importantly, the results of numerical simulations are in excellent agreement with our analytical descriptions, which clearly identify the fundamental properties of noise propagation in future hardware implementations. The main nuisance is noise in the input and output layers, or noise that is correlated across populations of hidden-layer neurons. We analyze the impact of the activation function, coupling topology and biases on noise propagation. The obtained results can be successfully applied to trained networks performing time-series prediction or image recognition.
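The four noise types considered in the abstract can be illustrated with a minimal sketch. The function below, an assumption for illustration (not the authors' code), applies additive and multiplicative Gaussian noise to the output of one layer of analog neurons; "correlated" draws a single noise realization shared by the whole layer, while "uncorrelated" draws an independent value per neuron. The noise amplitudes are placeholder values, not those extracted from the experiment in [3].

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_layer_output(y, sigma_add=0.01, sigma_mul=0.01, correlated=False, rng=rng):
    """Apply additive and multiplicative noise to a layer output y of shape (N,).

    correlated=True : one shared noise value for all N neurons.
    correlated=False: independent noise per neuron.
    """
    n = y.shape[0]
    if correlated:
        xi_add = rng.normal(0.0, sigma_add)            # single realization, layer-wide
        xi_mul = rng.normal(0.0, sigma_mul)
    else:
        xi_add = rng.normal(0.0, sigma_add, size=n)    # independent per neuron
        xi_mul = rng.normal(0.0, sigma_mul, size=n)
    # multiplicative noise scales the signal; additive noise shifts it
    return y * (1.0 + xi_mul) + xi_add

# Example: noisy readout of a tanh layer with 100 neurons
y = np.tanh(rng.normal(size=100))
y_noisy = noisy_layer_output(y, correlated=False)
```

In a deep network, this perturbation would be applied at every layer, so that the analytical question studied in [1,2] is how these per-layer contributions accumulate from input to output.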
This study is supported by the Russian Science Foundation (Project No. 21-72-00002).
[1] N. Semenova et al., Chaos 29, 103128 (2019).
[2] N. Semenova et al., arXiv:2103.07413 (2021).
[3] J. Bueno et al., Optica 5, 756-760 (2018).
Speaker
Nadezhda Semenova
Saratov State University
Russia