SARATOV FALL MEETING SFM 


Impact of internal noise on trained deep neural network

Daniil Maksimov¹, Nadezhda Semenova¹
¹Saratov State University, Saratov, Russia

Abstract

Artificial neural networks (ANNs) are mathematical models inspired by the activity of the human brain. They are used to solve complex problems in many fields, such as pattern recognition, natural language processing, robot control, speech recognition, improving the quality of audio recordings, predicting climate conditions, and much more.
ANNs grow in size and capability every year, and the tasks they are asked to solve may soon exceed what conventional computers and clusters can provide. This is where a new generation of ANNs arises: hardware neural networks. Hardware (or analog) neural networks are special hardware devices or systems designed to implement neural networks on the basis of physical principles. This work studies a deep neural network model in order to determine how additive white Gaussian noise of different intensities affects the operation of a deep ANN and to compare the accuracy of the network with and without noise. The network was trained to recognize handwritten digits from the MNIST database, and it is then shown how internal noise in the hidden layer changes the recognition accuracy and which noise intensities are critical for this task.
This work was supported by the Russian Science Foundation (project No. 23-72-01094) https://rscf.ru/project/23-72-01094/
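To make the setup concrete, below is a minimal sketch (not the authors' implementation) of this kind of experiment in Python/PyTorch: a single-hidden-layer network is trained on MNIST without noise, and additive white Gaussian noise of varying standard deviation is then injected into the hidden-layer activations at inference time. The layer sizes, activation function, training settings, and the list of noise intensities are illustrative assumptions, not values taken from the paper.

# Minimal sketch: train a one-hidden-layer MLP on MNIST, then evaluate it with
# additive white Gaussian noise injected into the hidden layer.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class NoisyMLP(nn.Module):
    def __init__(self, hidden=256, noise_std=0.0):
        super().__init__()
        self.fc1 = nn.Linear(28 * 28, hidden)
        self.fc2 = nn.Linear(hidden, 10)
        self.noise_std = noise_std  # standard deviation of the internal noise

    def forward(self, x):
        h = torch.sigmoid(self.fc1(x.view(x.size(0), -1)))
        if self.noise_std > 0:
            # additive white Gaussian noise applied to the hidden-layer activations
            h = h + self.noise_std * torch.randn_like(h)
        return self.fc2(h)

def accuracy(model, loader, device):
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.size(0)
    return correct / total

if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    tfm = transforms.ToTensor()
    train_dl = DataLoader(datasets.MNIST("data", train=True, download=True, transform=tfm),
                          batch_size=128, shuffle=True)
    test_dl = DataLoader(datasets.MNIST("data", train=False, download=True, transform=tfm),
                         batch_size=256)

    model = NoisyMLP().to(device)          # trained without noise
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(3):                 # short training run for illustration
        model.train()
        for x, y in train_dl:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # Evaluate the same trained network under increasing internal noise intensity.
    for std in [0.0, 0.05, 0.1, 0.2, 0.5, 1.0]:
        model.noise_std = std
        print(f"noise std = {std:.2f} -> test accuracy = {accuracy(model, test_dl, device):.4f}")

Sweeping the noise standard deviation on the already-trained network mirrors the question posed in the abstract: up to some intensity the recognition accuracy stays close to the noise-free value, and beyond a critical intensity it degrades.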

Speaker

Daniil Maksimov
Saratov State University
Russia
