Authors:
Hari Hara Suthan Chittoor 1; Paul Robert Griffin 1; Ariel Neufeld 2; Jayne Thompson 3,4 and Mile Gu 2
Affiliations:
1 School of Computing and Information Systems, Singapore Management University, 178902, Singapore
2 Nanyang Quantum Hub, School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore
3 Institute of High Performance Computing (IHPC), Agency for Science, Technology and Research (A*STAR), Singapore
4 Centre for Quantum Technologies, National University of Singapore, Singapore
Keyword(s):
Quantum Computing, Machine Learning, Time Series Forecasting, Hybrid Model.
Abstract:
Long-term time series forecasting (LTSF) involves predicting a large number of future values of a time series based on its past values. This is an essential task in a wide range of domains, including weather forecasting, stock market analysis and disease outbreak prediction. Over the decades, LTSF algorithms have transitioned from statistical models to deep learning models such as transformers. Despite the complex architecture of transformer-based LTSF models, 'Are Transformers Effective for Time Series Forecasting?' (Zeng et al., 2023) showed that simple linear models can outperform state-of-the-art transformer-based LTSF models. Recently, quantum machine learning (QML) has emerged as a domain for enhancing the capabilities of classical machine learning models. In this paper, we initiate the application of QML to LTSF problems by proposing QuLTSF, a simple hybrid QML model for multivariate LTSF. Through extensive experiments on a widely used weather dataset, we show the advantages of QuLTSF over state-of-the-art classical linear models in terms of reduced mean squared error and mean absolute error.
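To make the idea of a hybrid QML forecaster concrete, the sketch below shows one common way such a model can be wired together, assuming PennyLane and PyTorch. The abstract does not specify QuLTSF's actual architecture, so the layer choices, qubit count, and window lengths here are illustrative placeholders, not the authors' design.

```python
# A minimal, illustrative sketch of a hybrid quantum-classical forecaster.
# Assumes PennyLane and PyTorch; the real QuLTSF architecture is not given
# in this abstract, so all choices below are placeholders.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Encode classical features as rotation angles, then apply entangling layers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridForecaster(nn.Module):
    """Maps a look-back window of length `lookback` to a forecast of length `horizon`."""
    def __init__(self, lookback: int, horizon: int):
        super().__init__()
        self.pre = nn.Linear(lookback, n_qubits)   # compress history to qubit inputs
        self.q = qml.qnn.TorchLayer(circuit, {"weights": (n_layers, n_qubits)})
        self.post = nn.Linear(n_qubits, horizon)   # expand measurements to forecast

    def forward(self, x):                          # x: (batch, lookback)
        return self.post(self.q(self.pre(x)))

# Example usage with a typical LTSF setting (96-step input, 720-step horizon).
model = HybridForecaster(lookback=96, horizon=720)
y_hat = model(torch.randn(8, 96))                  # y_hat: (8, 720)
```

Such a model is typically trained end to end with mean squared error, which also matches the evaluation metrics (MSE and MAE) reported in the abstract.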