Temal: A Time Encoding Module Augmented LLM for Financial Forecasting

Authors: Mingjun Ma, Chenyu Wang, Ruyao Xu, Shuxiao Chen, Zhongchen Miao, Jian Gao, Jidong Lu, Guangwei Shi
Conference: ICIC 2024 Posters, Tianjin, China, August 5-8, 2024
Pages: 455-468
Keywords: Time Series Forecasting, Large Language Models, Financial Derivatives

Abstract

In the domain of time series analysis, financial forecasting presents itself as a pinnacle of intricacy. Despite the multitude of models, even those powered by cutting-edge transformer architectures, their practical efficacy on financial datasets remains underexplored. This challenge stems from the unique nature of financial derivatives: varied time scales, multifaceted attributes, and volatile patterns. Therefore, this study introduces an innovative multi-modal fine-tuning framework that harnesses the semantic comprehension capabilities of Large Language Models (LLMs) and encodes both time-series data and its domain-specific knowledge. To mitigate the shortcomings of LLMs in capturing temporal dynamics, we propose two pivotal innovations: a Time-series Encoding Module (TEM) and a Multi-Patch Method. The TEM seamlessly embeds sophisticated temporal representation algorithms within the LLM architecture. Concurrently, the Multi-Patch Method transforms a 1D time series into multiple sets of 2D tensors, each representing distinct temporal segments, thereby enriching the model's temporal analysis capabilities. Our empirical evaluations reveal that the Multi-Patch Method adeptly handles the complex temporal fluctuations across varied intervals. The proposed model outperforms other competing methods, marking a 20.2% enhancement in forecasting accuracy for Turnover Ratio and a 9.1% improvement in zero-shot forecasting performance. Crucially, the TEM and the Multi-Patch Method offer modular improvements for LLM-based time-series forecasting, with potential applications across various domains.
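The Multi-Patch Method described above can be illustrated with a minimal sketch: a 1D series is segmented at several patch lengths, yielding one 2D tensor (patches × patch length) per length. The specific patch lengths and the trimming of ragged tails here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def multi_patch(series, patch_lengths=(4, 8, 16)):
    """Reshape a 1D series into multiple 2D patch tensors, one per
    patch length (illustrative sketch of a multi-patch segmentation;
    the patch lengths are assumed, not specified by the paper)."""
    views = []
    for p in patch_lengths:
        n = len(series) // p           # number of complete patches
        trimmed = series[:n * p]       # drop any ragged tail
        views.append(trimmed.reshape(n, p))  # shape: (num_patches, patch_len)
    return views

x = np.arange(32, dtype=float)
for v in multi_patch(x):
    print(v.shape)  # (8, 4), (4, 8), (2, 16)
```

Each 2D view exposes the same series at a different temporal granularity, which is what lets downstream layers attend to both short- and long-range fluctuations.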