CMTFormer: Contrastive Multi-Scale Transformer for Long-Term Time Series Forecasting

Authors: Chenhao Ye, Shuai Zhang, and Guangping Xu
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 36-48
Keywords: Long-term time series forecasting, multi-scale temporal modeling, contrastive learning, self-attention.

Abstract

Long-term time series forecasting remains challenging due to complex temporal dependencies, diverse data distributions, and the computational cost of processing extended sequences. We propose CMTFormer, a novel architecture that addresses these limitations through multi-scale temporal modeling and contrastive learning. Our approach combines adaptive trend decomposition across multiple timescales with a representation learning framework built on self-attention mechanisms and dilated convolutions. The proposed multi-scale trend decomposition disentangles time series into interpretable components at varying resolutions, while the contrastive learning strategy sharpens feature discrimination by pulling semantically related temporal patterns together and pushing unrelated ones apart. Extensive experiments on six real-world benchmarks spanning energy, transportation, weather, finance, and public health domains demonstrate that CMTFormer consistently outperforms state-of-the-art forecasting models.
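To make the two core ideas in the abstract concrete, the sketch below illustrates (a) a multi-scale trend decomposition via moving averages at several kernel sizes, in the style popularized by decomposition-based forecasters, and (b) an InfoNCE-style contrastive loss that separates related from unrelated temporal patterns. This is a minimal illustration under assumed design choices; the paper's actual layers, kernel sizes, and loss formulation are not specified here, and all names are hypothetical.

```python
# Minimal sketch of the abstract's two mechanisms. Assumes an
# Autoformer-style moving-average trend decomposition and an
# InfoNCE-style contrastive objective; not CMTFormer's exact design.
import torch
import torch.nn.functional as F


def multi_scale_trend_decompose(x, kernel_sizes=(5, 25, 75)):
    """Split a series into trend components at several timescales.

    x: tensor of shape (batch, length, channels). Returns one trend
    per kernel size plus the residual after removing the finest trend.
    Kernel sizes here are illustrative, not the paper's.
    """
    trends = []
    for k in kernel_sizes:
        # Replicate-pad so the moving average preserves sequence length.
        pad = (k - 1) // 2
        xp = F.pad(x.transpose(1, 2), (pad, k - 1 - pad), mode="replicate")
        trends.append(F.avg_pool1d(xp, kernel_size=k, stride=1).transpose(1, 2))
    residual = x - trends[0]  # seasonal/remainder component
    return trends, residual


def info_nce(anchor, positive, negatives, temperature=0.1):
    """Contrastive loss: pull the anchor embedding toward its positive
    view and push it away from embeddings of unrelated patterns.

    anchor, positive: (batch, dim); negatives: (batch, num_neg, dim).
    """
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    n = F.normalize(negatives, dim=-1)
    pos_logit = (a * p).sum(-1, keepdim=True)       # (batch, 1)
    neg_logits = torch.einsum("bd,bkd->bk", a, n)   # (batch, num_neg)
    logits = torch.cat([pos_logit, neg_logits], dim=1) / temperature
    # The positive sits at index 0 of each row of logits.
    labels = torch.zeros(a.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```

In this reading, the decomposed trends at coarse and fine resolutions would feed the attention backbone, while the contrastive term acts as an auxiliary training objective on the learned representations.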