Saratov Fall Meeting (SFM)


Modeling and evaluation of noise effects on the performance of hardware neural networks

I.D. Kolesnikov¹, N.I. Semenova¹ (¹Saratov State University, Saratov, Russia)

Abstract

Neural networks have become a powerful tool in machine learning and artificial intelligence due to their high performance. Special attention is given to hardware neural networks, which are implemented physically rather than through software. However, such systems are vulnerable to noise, especially internal component noise, which can reduce their accuracy and reliability.
This study focuses on two types of noise: additive and multiplicative. A simple neural network model with 784 input neurons, 20 hidden neurons, and 10 output neurons was developed to recognize handwritten digits from the MNIST dataset. The model was trained with gradient descent, which provides stable training and good recognition accuracy.
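The 784-20-10 architecture above can be sketched in a few lines of numpy. The abstract does not specify activation functions or weight initialization, so sigmoid hidden units, a softmax output, and small random Gaussian weights are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the abstract: 784 input, 20 hidden, 10 output neurons.
# Sigmoid / softmax activations and the initialization scale are assumed.
W1 = rng.normal(0.0, 0.1, (784, 20))
b1 = np.zeros(20)
W2 = rng.normal(0.0, 0.1, (20, 10))
b2 = np.zeros(10)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    h = sigmoid(x @ W1 + b1)           # hidden layer (20 neurons)
    logits = h @ W2 + b2               # output layer (10 digit classes)
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

y = forward(rng.random(784))           # one 28x28 image flattened to 784 values
print(y.shape)
```

Training such a model by gradient descent then amounts to minimizing a classification loss (e.g. cross-entropy) over the MNIST training set.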
Experiments showed that additive noise in the hidden layers significantly degrades performance by distorting important features necessary for correct classification. In contrast, multiplicative noise was less damaging, as it mainly changes the input scale without significantly altering the data structure. As a result, neural networks show greater robustness to multiplicative noise.
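The two noise types can be modeled as follows. In this sketch (noise intensities and the exact injection point are illustrative assumptions, not taken from the abstract), additive noise shifts every value regardless of its magnitude, while multiplicative noise only rescales each value, so zeros stay zero and the structure of the data is largely preserved:

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy(signal, sigma_add=0.0, sigma_mul=0.0, rng=rng):
    """Apply additive and/or multiplicative Gaussian noise to a layer output.

    sigma_add: std of zero-mean additive noise (shifts values).
    sigma_mul: std of unit-mean multiplicative noise (rescales values).
    """
    add = rng.normal(0.0, sigma_add, np.shape(signal))
    mul = rng.normal(1.0, sigma_mul, np.shape(signal))
    return signal * mul + add

h = np.array([0.0, 0.5, 1.0])
print(noisy(h, sigma_mul=0.1))  # multiplicative noise leaves the zero entry at zero
print(noisy(h, sigma_add=0.1))  # additive noise distorts all entries, including zero
```

This asymmetry is one simple way to see why hidden-layer additive noise is the more damaging of the two.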
To evaluate the impact of noise, two metrics were used: variance (the square of the standard deviation) and the signal-to-noise ratio (SNR), calculated as the ratio of the squared mean output signal to its variance.
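Both metrics follow directly from the definitions above. A minimal sketch (the sample values are made up for illustration):

```python
import numpy as np

def snr(outputs):
    """SNR as defined in the abstract: squared mean output over its variance."""
    outputs = np.asarray(outputs, dtype=float)
    return outputs.mean() ** 2 / outputs.var()

# Example: repeated noisy readings of one output neuron.
samples = np.array([0.9, 1.0, 1.1, 1.0])
print(samples.var())   # variance = squared standard deviation
print(snr(samples))    # higher SNR means the signal dominates the noise
```

A low variance (equivalently, a high SNR) at the output indicates that the network's response is robust to the injected noise.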
Keywords: neural networks, noise, machine learning
The research was supported by the Russian Science Foundation, grant No. 25-72-10055, https://rscf.ru/project/25-72-10055/


Speaker

Ivan Kolesnikov
Saratov State University
Russia
