Exploring the Concept of Recurrent Neural Networks
Keywords:
Gated recurrent unit (GRU), Long short-term memory (LSTM), Recurrent neural network (RNN), Sequence prediction, Sequential input, Speech recognition, Text generation
Abstract
Recurrent neural networks (RNNs) are a class of artificial neural networks designed for pattern recognition in sequential data, such as time series and natural language. This paper discusses the design, operation, and applications of RNNs, along with their advantages and disadvantages compared with alternative neural network architectures. The paper then examines developments in RNN models, notably the widely used Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), which address the long-standing vanishing gradient problem that has plagued traditional RNNs. Finally, we conclude with a discussion of the position of RNNs in the deep learning landscape and possible future directions.
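To make the abstract's contrast concrete, the following is a minimal NumPy sketch of a vanilla RNN step next to an LSTM step. This is not the paper's implementation; the function names, weight shapes, and gate layout are our own illustrative choices. The key difference the abstract alludes to is visible in the cell update: the LSTM carries state forward additively through gates, which eases gradient flow over long sequences, whereas the vanilla RNN squashes the entire state through a tanh at every step.

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    # Vanilla RNN: the whole hidden state is re-squashed each step,
    # so repeated tanh/matrix products can shrink gradients (vanishing gradient).
    return np.tanh(x @ W_xh + h_prev @ W_hh + b_h)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # W, U, b stack the four LSTM transforms: input gate, forget gate,
    # output gate, and candidate cell (an illustrative layout, not canonical).
    z = x @ W + h_prev @ U + b
    i, f, o, g = np.split(z, 4, axis=-1)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    # Additive cell update: old memory passes through scaled by the forget
    # gate, which is what mitigates the vanishing gradient problem.
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c
```

Both steps map an input vector and the previous state(s) to a new hidden state; the LSTM additionally threads a cell state `c` through time. GRUs follow the same gating idea with a merged cell/hidden state and one fewer gate.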
Published
2025-04-16
How to Cite
R. Nalawade, S., R. Barapatre, P., H. O., T., & Prasad Kulkarni, P. (2025). Exploring the Concept of Recurrent Neural Networks. Journal of Security in Computer Networks and Distributed Systems, 2, 20–27. Retrieved from https://matjournals.net/engineering/index.php/JoSCNDS/article/view/1734
Section
Articles