How internal noise impacts the performance of recurrent neural networks
Nadezhda Semenova1
1Saratov State University, Saratov, Russia
Abstract
In recent years, a growing number of works have been devoted to analog (hardware) implementations of artificial neural networks, in which neurons and the connections between them are based not on computer calculations but on physical principles. Such networks offer improved energy efficiency and, in some cases, scalability, but may be susceptible to internal noise. Here we study the influence of noise on the functioning of recurrent networks using trained echo state networks (ESNs) as an example. As ESN topologies we chose the most common reservoir connection matrices: random uniform and band matrices with different connectivity. The noise was white Gaussian; depending on how it was introduced, it was additive or multiplicative, and either correlated or uncorrelated across neurons. We show that noise propagation in the reservoir is mainly controlled by the statistical properties of the output connection matrix, namely its mean and mean square. Depending on these values, correlated or uncorrelated noise accumulates more strongly in the network. We also show that under certain conditions even noise with an intensity of 10^−20 is enough to completely destroy the useful signal. We show which types of noise are most critical for networks with different activation functions (hyperbolic tangent, sigmoid, and linear) and for networks operating in a closed loop (self-closed).
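To make the noise taxonomy above concrete, the following is a minimal sketch of a noisy ESN update step. It shows one common way additive vs. multiplicative and correlated vs. uncorrelated white Gaussian noise can be injected after a tanh activation; the reservoir size, spectral-radius scaling, noise intensity, and the exact point of injection are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                   # reservoir size (illustrative)
W = rng.uniform(-1.0, 1.0, (N, N))        # random uniform reservoir matrix
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale to spectral radius 0.9
w_in = rng.uniform(-1.0, 1.0, N)          # input weights

def step(x, u, sigma=1e-3, noise="additive", correlated=False):
    """One noisy ESN update with tanh activation.

    additive:       state <- f(W x + w_in u) + xi
    multiplicative: state <- f(W x + w_in u) * (1 + xi)
    correlated:     one noise sample shared by all neurons;
    uncorrelated:   an independent sample per neuron.
    sigma is the noise variance (intensity).
    """
    a = np.tanh(W @ x + w_in * u)
    # scalar sample broadcasts over all neurons -> correlated noise
    xi = rng.normal(0.0, np.sqrt(sigma), size=None if correlated else N)
    if noise == "additive":
        return a + xi
    return a * (1.0 + xi)

# drive the reservoir with a sine input under multiplicative correlated noise
x = np.zeros(N)
for t in range(200):
    x = step(x, np.sin(0.1 * t), noise="multiplicative", correlated=True)
```

With `correlated=True` a single Gaussian sample is broadcast to every neuron, so the perturbation cannot average out across the reservoir; with `correlated=False` each neuron receives an independent sample, which is the regime where the mean of the output weights, rather than the mean square, matters less.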
This work was supported by the Russian Science Foundation (project No. 23-72-01094).
https://rscf.ru/project/23-72-01094/