Tasks: Time Series Forecasting
Modalities: Image
Formats: imagefolder
Size: < 1K
ArXiv: 2412.11376
Tags: time-series, multimodality, pretrained-model, foundation-model, multimodal-time-series-foundation-model
ChengsenWang committed · Commit cf6cdec · Parent(s): f93eb3c
Update README.md

README.md CHANGED
@@ -20,7 +20,7 @@ In this paper, we innovatively model time series as a foreign language and const

 As depicted in Figure 1(b), during the continuous pre-training stage, we pre-train [LLaMA-2-7B-Base](https://huggingface.co/meta-llama/Llama-2-7b-hf) on [ChengsenWang/ChatTime-1-Pretrain-1M](https://huggingface.co/datasets/ChengsenWang/ChatTime-1-Pretrain-1M), yielding [ChengsenWang/ChatTime-1-7B-Base](https://huggingface.co/ChengsenWang/ChatTime-1-7B-Base).

-For details on ChatTime models, training data and procedures, and experimental results, please refer to the [arXiv](https://arxiv.org/abs/
+For details on ChatTime models, training data and procedures, and experimental results, please refer to the [arXiv](https://arxiv.org/abs/2412.11376).

 ![](architecture.png)

@@ -36,7 +36,7 @@ The data for continuous pre-training is sourced from two extensive open-source t
 | 72 | 64 | 8 | 4 |
 | 36 | 32 | 4 | 2 |

-For details on pre-training dataset, please refer to the [arXiv](https://arxiv.org/abs/
+For details on pre-training dataset, please refer to the [arXiv](https://arxiv.org/abs/2412.11376).

 ## 📝 Citation
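For context on the two Hub repos this diff links together, here is a minimal sketch of loading them, assuming both work with the stock `datasets`/`transformers` loading APIs (splits, tokenizer settings, and any repo-specific configuration are not confirmed by this commit):

```python
# Minimal sketch: pull the pre-training corpus and the resulting base model
# from the Hugging Face Hub. Assumption: both repos load with the standard
# APIs; nothing repo-specific is verified here.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# Corpus used for the continuous pre-training stage described in the README.
corpus = load_dataset("ChengsenWang/ChatTime-1-Pretrain-1M")
print(corpus)  # inspect available splits and features

# Checkpoint produced by that stage.
tokenizer = AutoTokenizer.from_pretrained("ChengsenWang/ChatTime-1-7B-Base")
model = AutoModelForCausalLM.from_pretrained("ChengsenWang/ChatTime-1-7B-Base")
```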