
4.5 Comparative Result Analysis
To benchmark our proposed LSTM model, we compared our experimental results in Table 2 and Table 3 against the results reported in four other published IEEE papers.
In all cases, the LSTM model successfully learned the temporal dependencies in the stock price sequences and was able to generalize well to unseen testing data.
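For reference, the sketch below illustrates how such a windowed, chronologically split evaluation can be set up in Keras. It is a minimal illustration only: the window length, layer sizes, split fraction, and the synthetic price series are assumptions for demonstration, not the exact configuration used in this study.

```python
# Minimal sketch of a windowed LSTM on closing prices (illustrative only:
# window length, layer sizes, and split ratio are assumptions, not the
# exact configuration reported in this paper).
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(series, window=60):
    """Turn a 1-D price series into (samples, window, 1) inputs and next-day targets."""
    X, y = [], []
    for i in range(window, len(series)):
        X.append(series[i - window:i])
        y.append(series[i])
    return np.array(X).reshape(-1, window, 1), np.array(y)

# Synthetic random-walk stand-in for a real closing-price series.
closing = 100 + np.cumsum(np.random.randn(1500))

scaler = MinMaxScaler()
scaled = scaler.fit_transform(closing.reshape(-1, 1)).ravel()

X, y = make_windows(scaled, window=60)
split = int(0.8 * len(X))                  # chronological split: no shuffling
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

model = Sequential([
    LSTM(50, return_sequences=True, input_shape=(60, 1)),
    LSTM(50),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=25, batch_size=32, validation_split=0.1)
```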
Importantly, no significant divergence was observed between actual and predicted prices toward the end of the testing period for any of the stocks, confirming the model's stability and robustness.
Furthermore, the prediction plots clearly illustrated that the LSTM model was not merely memorizing training data but was genuinely learning underlying patterns and trends that could be generalized across time periods.
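A plot of this kind can be produced in a few lines of matplotlib. The sketch below assumes the model, scaler, and test arrays from the earlier sketch; it is illustrative and not the exact plotting script used for the figures in this paper.

```python
import matplotlib.pyplot as plt

# Inverse-transform predictions back to price scale before plotting
# (assumes model, scaler, X_test and y_test from the earlier sketch).
pred = scaler.inverse_transform(model.predict(X_test)).ravel()
actual = scaler.inverse_transform(y_test.reshape(-1, 1)).ravel()

plt.figure(figsize=(10, 4))
plt.plot(actual, label="Actual close")
plt.plot(pred, label="Predicted close")
plt.xlabel("Test-period trading day")
plt.ylabel("Price")
plt.legend()
plt.title("Actual vs. predicted closing prices on the held-out test period")
plt.show()
```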
5 DISCUSSION AND LIMITATIONS
The experimental results demonstrate the effectiveness of LSTM for stock price forecasting: R² scores above 0.9 for most stocks confirm that the model captures complex temporal dependencies. Comparative analysis showed the LSTM consistently outperforming ARIMA and matching or exceeding GRU and Transformer performance. Key limitations include: (1) the analysis is univariate, using only closing prices and therefore forgoing technical indicators such as RSI and MACD as well as sentiment features; (2) single-step prediction limits real-world trading applicability; (3) the model is computationally more expensive than simpler baselines; and (4) performance degrades on highly volatile stocks such as Tesla during extreme market conditions. Despite these limitations, the study confirms LSTM networks as reliable tools for financial time series modeling, with an architecture suitable for algorithmic trading systems, risk modeling, and portfolio optimization strategies.
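For concreteness, held-out scores of this kind can be computed with scikit-learn as sketched below; the actual and pred arrays are assumed to come from an evaluation pipeline like the earlier sketches, not from the authors' exact scripts.

```python
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Metrics on the held-out test period, in original price units
# (assumes the `actual` and `pred` arrays from the earlier sketches).
r2 = r2_score(actual, pred)
rmse = np.sqrt(mean_squared_error(actual, pred))
mae = mean_absolute_error(actual, pred)
print(f"R^2: {r2:.4f}   RMSE: {rmse:.2f}   MAE: {mae:.2f}")
```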
6 CONCLUSION AND FUTURE WORK
This study demonstrates the effectiveness of LSTM neural networks in predicting stock prices from historical closing price data. Key findings include strong generalization (R² > 0.93 for most stocks), stable training without overfitting, robust performance across volatility levels, superior performance relative to the traditional ARIMA benchmark, and competitive results against other deep learning models. The results confirm LSTM's ability to capture non-linear, time-dependent structure in stock market data. The model's modularity allows straightforward extension and real-world application in portfolio management and algorithmic trading. Future enhancements could incorporate additional features (volume, volatility indicators, sentiment analysis), extend to multi-step predictions (sketched below), combine the network with attention mechanisms, include macroeconomic indicators, deploy real-time systems with continuous retraining, and apply explainability tools for transparent decision-making.
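As one illustration of the multi-step direction mentioned above, the sketch below predicts the next few closing prices directly from each input window. The horizon, window length, and layer sizes are hypothetical choices for illustration, not results from this study.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_multistep_windows(series, window=60, horizon=5):
    """Pair each input window with the next `horizon` values as a joint target."""
    X, y = [], []
    for i in range(window, len(series) - horizon + 1):
        X.append(series[i - window:i])
        y.append(series[i:i + horizon])
    return np.array(X).reshape(-1, window, 1), np.array(y)

# Direct multi-step model: one output unit per future day.
horizon = 5
multi_model = Sequential([
    LSTM(50, input_shape=(60, 1)),
    Dense(horizon),
])
multi_model.compile(optimizer="adam", loss="mse")
# Trained the same way as the single-step model, e.g.:
# X, y = make_multistep_windows(scaled, window=60, horizon=horizon)
# multi_model.fit(X[:split], y[:split], epochs=25, batch_size=32)
```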
ACKNOWLEDGMENT
The authors would like to sincerely thank Dr. Bonaventure Chidube Molokwu at California State University, Sacramento, for their invaluable guidance, support, and constructive feedback throughout the duration of this project. Their encouragement and insightful comments were instrumental in shaping the methodology, experiments, and analysis presented in this study.
We also acknowledge the availability of open-source financial data APIs such as Yahoo Finance and the use of powerful deep learning libraries like TensorFlow and Scikit-learn, which greatly facilitated the successful completion of this project.