Figure 7: Prediction curve of the single-layer GRU on 000006.SS.
reasons to believe that prediction accuracy can be greatly improved by tailoring the model to specific funds. The reliability of different prediction results can also be assessed by subjective judgment, aided by recent prediction curves, so long-term prediction of fund trends through deep learning is feasible.
The purpose of this experiment was to build an auxiliary forecaster through deep learning. The financial market is, after all, highly variable and strongly influenced by news, so deep learning here serves mainly as a technical predictor: it can be used as a reference, but not as a decisive factor. A single-layer GRU neural network model achieving this level of prediction accuracy is therefore already a pleasant surprise and fully sufficient. In the future, the process of tuning the network's parameters can be streamlined and made more efficient, and AI-driven training methods can be adopted to simplify the task significantly. Continuing this research, our aim is to turn the auxiliary predictor developed here into a versatile tool that benefits a broader audience, facilitating investment decisions for a wide range of individuals.
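For reference, the gating mechanism of the single-layer GRU discussed above can be sketched in plain NumPy. This is a generic illustration of the standard GRU cell equations (update gate, reset gate, candidate state, in the Cho et al. formulation); the function names and the zero-initialized parameters in the usage note are illustrative and do not reflect the actual weights or hyperparameters of the trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU cell step: returns the new hidden state.

    params is a tuple (Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh) where the
    W matrices act on the input x and the U matrices on the state h.
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde             # gated interpolation

def gru_forward(xs, h0, params):
    """Run the cell over a sequence of inputs, returning the final state."""
    h = h0
    for x in xs:
        h = gru_step(x, h, params)
    return h
```

With all parameters set to zero, both gates evaluate to 0.5 and the candidate state to 0, so each step simply halves the hidden state; this makes the gating interpolation easy to verify by hand.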
KDIR 2023 - 15th International Conference on Knowledge Discovery and Information Retrieval