Guanzheng committed on
Commit fa72af3 · verified · 1 Parent(s): 9d26e1b

Update README.md

Files changed (1): README.md (+10 -6)
README.md CHANGED
@@ -1,3 +1,8 @@
+---
+license: mit
+language:
+- en
+---
 # CLEX: Continuous Length Extrapolation for Large Language Models
 This repo stores the checkpoint of CLEX-Phi-2-2.7B-32K.
 
@@ -18,10 +23,10 @@ If you have any questions, feel free to contact us. (Emails: guanzzh.chen@gmail.
 |:-----|:-----|:-----------|:-----------|:-----------|:-----------|:------:|
 | CLEX-LLaMA-2-7B-16K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 16K | 64K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-7B-16K) |
 | CLEX-LLaMA-2-7B-Chat-16K | chat | CLEX-7B-16K | [UltraChat](https://github.com/thunlp/UltraChat) | 16K | 64K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-7B-Chat-16K) |
-| CLEX-LLaMA-2-7B-64K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 64K | 256K | Pending Upload |
-| CLEX-Phi-2-2.7B-32K | base | Phi-2-2.7B | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32K | 128K | Pending Upload |
-| CLEX-Mixtral-8x7B-32K | base | Mixtral-8x7B-v0.1 | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32K | >128K | Pending Upload |
-| CLEX-Mixtral-8x7B-Chat-32K | chat | CLEX-Mixtral-8x7B-32K | [UltraChat 200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) | 32K | >128K | Pending Upload |
+| CLEX-LLaMA-2-7B-64K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 64K | 256K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-LLaMA-2-7B-64K) |
+| CLEX-Phi-2-2.7B-32K | base | Phi-2-2.7B | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32K | 128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Phi-2-2.7B-32K) |
+| CLEX-Mixtral-8x7B-32K | base | Mixtral-8x7B-v0.1 | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32K | >128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Mixtral-8x7B-32K) |
+| CLEX-Mixtral-8x7B-Chat-32K | chat | CLEX-Mixtral-8x7B-32K | [UltraChat 200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) | 32K | >128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Mixtral-8x7B-Chat-32K) |
 </div>
 
 
@@ -66,5 +71,4 @@ If you find our project useful, hope you can star our repo and cite our paper as
 journal = {arXiv preprint arXiv:2310.16450},
 url = {https://arxiv.org/abs/2310.16450}
 }
-```
-
+```
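
With the "Pending Upload" entries now replaced by repo links, the checkpoint documented here can be pulled directly from the Hub. Below is a minimal sketch of loading it, assuming the standard Hugging Face transformers Auto* API and assuming the CLEX repos ship custom modeling code (hence `trust_remote_code=True`); treat it as illustrative, not as the authors' official usage.

```python
# Minimal loading sketch (assumptions: standard transformers Auto* API;
# CLEX checkpoints require trust_remote_code=True for their custom code).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DAMO-NLP-SG/CLEX-Phi-2-2.7B-32K"  # repo linked in the table above

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",  # assumption: use the dtype stored in the checkpoint
)

# Short generation smoke test; the 32K train length and 128K eval length in
# the table describe context capacity, not this toy prompt.
inputs = tokenizer("CLEX extends the context window by", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```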