Commit 52fe6c7 by ChengsenWang (1 parent: 62d6544)

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -18,7 +18,7 @@ size_categories:
 
 In this paper, we innovatively model time series as a foreign language and construct ChatTime, a unified framework for time series and text processing. As an out-of-the-box multimodal time series foundation model, ChatTime provides zero-shot forecasting capability and supports bimodal input/output for both time series and text. We design a series of experiments to verify the superior performance of ChatTime across multiple tasks and scenarios, and create four multimodal datasets to address data gaps. The experimental results demonstrate the potential and utility of ChatTime.
 
-As depicted in Figure 1(b), during the continuous pre-training stage, we pre-train [LLaMA-2-7B-Base](https://huggingface.co/meta-llama/Llama-2-7b-hf) on [ChatTime-1-Pretrain-1M](https://huggingface.co/datasets/ChengsenWang/ChatTime-1-Pretrain-1M), yielding [ChatTime-1-7B-Base](https://huggingface.co/ChengsenWang/ChatTime-1-7B-Base).
+As depicted in Figure 1(b), during the continuous pre-training stage, we pre-train [LLaMA-2-7B-Base](https://huggingface.co/meta-llama/Llama-2-7b-hf) on [ChengsenWang/ChatTime-1-Pretrain-1M](https://huggingface.co/datasets/ChengsenWang/ChatTime-1-Pretrain-1M), yielding [ChengsenWang/ChatTime-1-7B-Base](https://huggingface.co/ChengsenWang/ChatTime-1-7B-Base).
 
 For details on ChatTime models, training data and procedures, and experimental results, please refer to the [arXiv](https://arxiv.org/abs/0000.00000).
 
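
For reference, here is a minimal sketch of pulling the two Hub artifacts named in the updated line. The repo IDs are taken from the links above; the use of the standard `datasets` and `transformers` loading APIs, and of `AutoModelForCausalLM` for a LLaMA-2-derived checkpoint, is an assumption rather than something this commit specifies.

```python
# Minimal sketch (assumptions noted): load the dataset and model referenced
# in the diff above. Repo IDs come from the README links; the standard
# Hugging Face loading APIs are assumed to apply to these repos.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# Continuous pre-training corpus referenced in the updated line.
pretrain_data = load_dataset("ChengsenWang/ChatTime-1-Pretrain-1M")

# Base checkpoint obtained by pre-training LLaMA-2-7B-Base on that corpus.
# AutoModelForCausalLM is assumed appropriate since the base is LLaMA-2.
tokenizer = AutoTokenizer.from_pretrained("ChengsenWang/ChatTime-1-7B-Base")
model = AutoModelForCausalLM.from_pretrained("ChengsenWang/ChatTime-1-7B-Base")

print(pretrain_data)
```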