Text Generation · PyTorch · causal-lm · rwkv
BlinkDL committed
Commit 3e133f4
1 Parent: db51479

Update README.md

Files changed (1): README.md (+3 −3)
README.md CHANGED
@@ -32,9 +32,9 @@ RWKV-4 trained on 100+ world languages (70% English, 15% multilang, 15% code).
 Some_Pile + Some_RedPajama + Some_OSCAR + All_Wikipedia + All_ChatGPT_Data_I_can_find
 
 How to use:
-* use latest rwkv pip package (0.7.4+)
-* use https://github.com/BlinkDL/ChatRWKV/blob/main/v2/benchmark_world.py to test it
-* larger models are stronger even though not fully trained yet
+* use https://github.com/josStorer/RWKV-Runner for GUI
+* use latest rwkv pip package (0.8.0+)
+* use https://github.com/BlinkDL/ChatRWKV/blob/main/v2/benchmark_world.py and https://github.com/BlinkDL/ChatRWKV/blob/main/API_DEMO_WORLD.py to test it
 
 The differences between World & Raven:
 * set pipeline = PIPELINE(model, "rwkv_vocab_v20230424") instead of 20B_tokenizer.json (EXACTLY AS WRITTEN HERE. "rwkv_vocab_v20230424" is included in rwkv 0.7.4+)
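For readers following the updated "How to use" steps, here is a minimal, unofficial sketch of loading a World checkpoint with the rwkv pip package (0.8.0+) and the "rwkv_vocab_v20230424" pipeline named in the diff. The checkpoint path and sampling settings are placeholders; the calls follow the ChatRWKV demos linked above.

```python
# Minimal sketch (not part of this commit): load an RWKV "World" model with
# the rwkv pip package (pip install "rwkv>=0.8.0") and the World tokenizer.
import os
os.environ["RWKV_JIT_ON"] = "1"   # enable the TorchScript JIT (set before importing rwkv)
os.environ["RWKV_CUDA_ON"] = "0"  # set to "1" to compile the optional CUDA kernel

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Placeholder path: point at a downloaded World .pth checkpoint, without the extension.
model = RWKV(model="path/to/RWKV-4-World-checkpoint", strategy="cpu fp32")

# World models use the bundled vocab name, NOT 20B_tokenizer.json.
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

# Example sampling settings (placeholders, not recommendations from the README).
args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
prompt = "Question: What is the capital of France?\n\nAnswer:"
print(pipeline.generate(prompt, token_count=100, args=args))
```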