This repo provides the model weights released in the paper "Towards Neural Scaling Laws for Time Series Foundation Models" (ICLR 2025).
The models range in size from 1M to 1B parameters and were trained on datasets spanning 10M to 16B time points.
Code: https://github.com/Qingrenn/TSFM-ScalingLaws
Dataset: https://huggingface.co/datasets/Qingren/TSFM-ScalingLaws-Dataset
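The checkpoints and dataset can be fetched locally with the `huggingface_hub` client. The sketch below is illustrative only: the model repo id is an assumption inferred from the dataset naming, so replace it with the id of the checkpoint you actually want.

```python
# Minimal sketch for downloading the released artifacts with huggingface_hub.
# NOTE: the model repo id below is an assumption based on the dataset naming;
# substitute the real repo id of the checkpoint you want to use.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(repo_id="Qingren/TSFM-ScalingLaws")  # model weights (assumed repo id)
data_dir = snapshot_download(
    repo_id="Qingren/TSFM-ScalingLaws-Dataset",  # pretraining dataset from the link above
    repo_type="dataset",
)

print("weights downloaded to:", model_dir)
print("dataset downloaded to:", data_dir)
```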
Figure 1: Scaling laws for NLL with respect to model size, compute, and dataset size. Blue lines show in-distribution (ID) performance; red and green lines show out-of-distribution (OOD) performance on the LSF and Monash subsets, respectively.
Figure 2: Prediction results of models with 1B, 300M, 100M, and 10M parameters.
@inproceedings{yaotowards,
  title={Towards Neural Scaling Laws for Time Series Foundation Models},
  author={Yao, Qingren and Yang, Chao-Han Huck and Jiang, Renhe and Liang, Yuxuan and Jin, Ming and Pan, Shirui},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025}
}