LightGTS: A Lightweight General Time Series Forecasting Model

🚩 News (2025.06) LightGTS has been accepted at ICML 2025.

Introduction

LightGTS is a lightweight general (pre-trained) model for time series forecasting.

Quick Demos

pip install transformers==4.30.2 # Use this version for stable compatibility

Zero-Shot

from configuration_LightGTS import LightGTSConfig
from modeling_LightGTS import LightGTSForPrediction
import torch
from transformers import AutoConfig, AutoModelForCausalLM

# save the config and register LightGTS with the Auto classes
LightGTS_config = LightGTSConfig(context_points=528, c_in=1, target_dim=192, patch_len=48, stride=48)
LightGTS_config.save_pretrained("LightGTS-huggingface")

AutoConfig.register("LightGTS", LightGTSConfig)
AutoModelForCausalLM.register(LightGTSConfig, LightGTSForPrediction)

model = AutoModelForCausalLM.from_pretrained(
    "./LightGTS-huggingface",
    trust_remote_code=True
)
# prepare input
batch_size, lookback_length = 1, 576
seqs = torch.randn(batch_size, lookback_length).unsqueeze(-1).float()
# generate forecasting results
forecast_length = 192
outputs = model.generate(seqs, patch_len=48, stride_len=48, max_output_length=forecast_length, inference_patch_len=48)
print(outputs.shape)
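To illustrate what the patch_len and stride arguments above control: the lookback window is split into fixed-length patches before being fed to the model. A minimal, standalone sketch of non-overlapping patching with patch_len = stride = 48 (this uses plain torch.Tensor.unfold and is independent of LightGTS itself):

```python
import torch

# A 576-step lookback splits into 576 / 48 = 12 non-overlapping patches;
# a 192-step forecast similarly corresponds to 192 / 48 = 4 patches.
patch_len, stride = 48, 48
seqs = torch.randn(1, 576, 1)  # (batch, lookback_length, channels)

# unfold the time dimension: (batch, num_patches, channels, patch_len)
patches = seqs.unfold(dimension=1, size=patch_len, step=stride)
print(patches.shape)  # torch.Size([1, 12, 1, 48])
```

With stride equal to patch_len the patches tile the window exactly; a smaller stride would produce overlapping patches.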

Fine-tune

For usage examples, please see test_finetune.py
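For orientation before reading that script, the general shape of a forecasting fine-tuning loop is sketched below. Note the model here is a hypothetical stand-in (a plain nn.Linear head), NOT the real LightGTSForPrediction, and the MSE loss and AdamW optimizer are generic PyTorch choices assumed for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the forecasting model: any module that maps a
# lookback window to a forecast horizon can be fine-tuned with this loop.
lookback, horizon = 576, 192
model = nn.Linear(lookback, horizon)  # placeholder, not LightGTS

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# toy data: (batch, lookback) inputs and (batch, horizon) targets
x = torch.randn(32, lookback)
y = torch.randn(32, horizon)

model.train()
for step in range(5):
    optimizer.zero_grad()
    pred = model(x)          # (batch, horizon) forecast
    loss = loss_fn(pred, y)  # error over the forecast horizon
    loss.backward()
    optimizer.step()
```

The real script additionally handles data loading, windowing, and checkpointing; the loop structure is the same.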

Citation

If you find LightGTS helpful for your research, please cite our paper:

@article{wang2025lightgts,
  title={LightGTS: A Lightweight General Time Series Forecasting Model},
  author={Wang, Yihang and Qiu, Yuying and Chen, Peng and Shu, Yang and Rao, Zhongwen and Pan, Lujia and Yang, Bin and Guo, Chenjuan},
  journal={arXiv preprint arXiv:2506.06005},
  year={2025}
}
