Multi-scale Period-dependent Transformer for Time Series Forecasting

Authors: Jiatian Pi, Chenyue Wang, Kanlun Tan, Xin Wang, Qiao Liu
Conference: ICIC 2024 Posters, Tianjin, China, August 5-8, 2024
Pages: 968-979
Keywords: Time-series forecasting, Periodicity, Transformer

Abstract

The periodicity of a time series helps improve the performance of forecasting models by revealing long-term trends, seasonal variations, and oscillatory phenomena.
Existing methods usually adopt a single-scale periodicity assumption at one fixed stage of the model, which is at odds with the inherently multi-scale and continuous nature of periodicity.
This creates a bottleneck in how these methods exploit the periodic properties of the series, limiting their ability to mine the underlying periodic information, capture reliable dependencies, and use them efficiently in forecasts.
To this end, we make full use of the multi-scale information of time-series periodicity and construct continuous periodic relational interactions at multiple stages.
By modeling dependencies and aggregating features at the sub-sequence level, we break the bottleneck of underutilized periodic information.
Specifically, we first extract the inherent stationary periodic measurements of the sequence data and embed multi-layer period patterns to model seasonal regularity.
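The abstract does not spell out how the periodic measurements are obtained, so the following is only a minimal sketch under the assumption that dominant periods are detected from FFT amplitudes, a common heuristic in period-aware forecasters; the function name and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def dominant_periods(x, k=3):
    """Estimate the k most salient periods of a 1-D series via FFT amplitudes.

    Illustrative heuristic, not the paper's exact procedure: the frequency
    bins with the largest amplitudes are mapped back to candidate period
    lengths, which could then drive multi-layer period embeddings.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # remove the mean so the DC bin is negligible
    amp = np.abs(np.fft.rfft(x))          # amplitude spectrum
    amp[0] = 0.0                          # ignore the DC component entirely
    top = np.argsort(amp)[-k:][::-1]      # k strongest frequency bins
    periods = [len(x) // f for f in top if f > 0]
    return sorted(set(periods), reverse=True)

# Example: hourly-style data with daily (24) and weekly (168) cycles
t = np.arange(1008)                       # 1008 is a multiple of both periods
series = np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
print(dominant_periods(series, k=2))      # -> [168, 24]
```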
Second, to capture long-range periodic correlations, we propose a novel attention mechanism that aggregates representations under the predictive paradigm with efficient sparse filtering based on periodic segments.
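As a hedged illustration of what "sparse filtering based on periodic segments" can look like (the paper's actual attention mechanism is not described here), the toy single-head attention below restricts each query to keys at the same phase of an assumed period; all names and the masking rule are assumptions for exposition only.

```python
import numpy as np

def period_sparse_attention(q, k, v, period):
    """Toy attention where each position attends only to positions whose
    time offset is a multiple of the assumed period (same-phase segments)."""
    L, d = q.shape
    scores = q @ k.T / np.sqrt(d)                        # (L, L) dot-product scores
    idx = np.arange(L)
    mask = (idx[None, :] - idx[:, None]) % period == 0   # keep same-phase pairs only
    scores = np.where(mask, scores, -np.inf)             # drop off-period pairs
    scores -= scores.max(axis=-1, keepdims=True)         # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Example: 96 time steps, model dimension 16, assumed period 24
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((96, 16)) for _ in range(3))
out = period_sparse_attention(q, k, v, period=24)
print(out.shape)  # (96, 16)
```

With this masking, each position mixes information from only L/period other positions, which is where the sparsity (and the efficiency gain over full attention) comes from.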
Third, we implement sequence decomposition at multiple period scales to precisely separate trend and seasonality, so that the intrinsic patterns of the time series can be deciphered and analyzed separately.
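For intuition, a minimal sketch of multi-period-scale trend/seasonal decomposition is given below, assuming centered moving averages at each candidate period (as in common series-decomposition blocks); this is not the authors' exact formulation, and the periods passed in are assumed inputs.

```python
import numpy as np

def multi_period_decompose(x, periods):
    """Split a 1-D series into trend and seasonal parts at several assumed
    period scales: one moving-average trend per period, averaged together,
    with the seasonal part taken as the residual."""
    x = np.asarray(x, dtype=float)
    trends = []
    for p in periods:
        pad = p // 2
        padded = np.pad(x, (pad, p - 1 - pad), mode="edge")  # keep output length
        kernel = np.ones(p) / p
        trends.append(np.convolve(padded, kernel, mode="valid"))
    trend = np.mean(trends, axis=0)
    seasonal = x - trend
    return trend, seasonal

# Example: linear trend plus a daily cycle, decomposed at periods 24 and 168
t = np.arange(1008)
series = 0.01 * t + np.sin(2 * np.pi * t / 24)
trend, seasonal = multi_period_decompose(series, periods=[24, 168])
print(trend.shape, seasonal.shape)  # (1008,) (1008,)
```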
Extensive experiments on five benchmarks show that our method achieves favorable results, especially on strongly periodic data.