GRU-Enhanced Attention Mechanism for LSTM in Hybrid CNN-LSTM Models for Stock Prediction

Abstract

We propose a novel GRU-enhanced attention mechanism integrated into the LSTM layers of hybrid CNN-LSTM models to improve stock prediction accuracy. The proposed method dynamically adjusts the importance of different time steps by combining the strengths of GRUs and attention mechanisms, capturing temporal dependencies more effectively in volatile financial time series. The GRU processes the input sequence to generate hidden states, which an attention mechanism then weights to compute a context vector. This context vector is fed into the LSTM layer, enabling the model to focus on the most relevant time steps and better handle non-stationarity and noise. Integrating GRU-enhanced attention into the LSTM improves the model's ability to capture long-term dependencies and temporal patterns, which are critical for accurate stock prediction. Experimental results demonstrate that the proposed approach outperforms traditional methods in prediction accuracy and robustness, particularly under high market volatility. Furthermore, the model's adaptability to varying time scales and its ability to filter out irrelevant information make it a promising tool for financial time series analysis. The proposed method not only advances the state of the art in stock prediction but also provides a framework for integrating attention mechanisms into other sequential-data tasks.
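The pipeline described above (GRU encodes the sequence, attention weights the hidden states into a context vector, and that vector is passed to the LSTM) can be sketched as follows. This is a minimal, illustrative NumPy implementation, not the authors' code: the class name `GRUAttention`, the additive (tanh) scoring function, and all dimensions are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUAttention:
    """Sketch of a GRU encoder whose hidden states are pooled by
    additive attention into a single context vector (assumed design)."""

    def __init__(self, input_dim, hidden_dim):
        s = 0.1  # small random init for the sketch
        self.hidden_dim = hidden_dim
        # GRU parameters: update gate z, reset gate r, candidate state h~
        def mats():
            return (s * rng.standard_normal((hidden_dim, input_dim)),
                    s * rng.standard_normal((hidden_dim, hidden_dim)),
                    np.zeros(hidden_dim))
        self.Wz, self.Uz, self.bz = mats()
        self.Wr, self.Ur, self.br = mats()
        self.Wh, self.Uh, self.bh = mats()
        # Attention parameters (additive / Bahdanau-style scoring, assumed)
        self.Wa = s * rng.standard_normal((hidden_dim, hidden_dim))
        self.va = s * rng.standard_normal(hidden_dim)

    def forward(self, x):
        """x: (T, input_dim) sequence of per-day features.
        Returns (context, alpha): the context vector to feed the LSTM,
        and the attention weights over the T time steps."""
        h = np.zeros(self.hidden_dim)
        hidden_states = []
        for x_t in x:
            z = sigmoid(self.Wz @ x_t + self.Uz @ h + self.bz)   # update gate
            r = sigmoid(self.Wr @ x_t + self.Ur @ h + self.br)   # reset gate
            h_cand = np.tanh(self.Wh @ x_t + self.Uh @ (r * h) + self.bh)
            h = (1 - z) * h + z * h_cand
            hidden_states.append(h)
        H = np.stack(hidden_states)                  # (T, hidden_dim)
        scores = np.tanh(H @ self.Wa.T) @ self.va    # (T,) attention scores
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                         # softmax over time steps
        context = alpha @ H                          # weighted context vector
        return context, alpha

# Example: 30 trading days, 5 features per day (e.g. OHLCV), 8 hidden units
model = GRUAttention(input_dim=5, hidden_dim=8)
sequence = rng.standard_normal((30, 5))
context, alpha = model.forward(sequence)
```

In the full hybrid architecture, `sequence` would be the feature maps produced by the CNN front end, and `context` would be consumed by the downstream LSTM layer; the attention weights `alpha` indicate which time steps the model deemed most relevant.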

Keywords

Financial Time Series, GRU-Enhanced Attention Mechanism, Hybrid CNN-LSTM Model, Stock Prediction, Deep Learning
