totally-not-an-llm committed
Commit
64dd787
1 Parent(s): 8456a85

Update README.md

Files changed (1): README.md +2 -1
README.md CHANGED

@@ -15,7 +15,8 @@ This model is an early test of the EverythingLM dataset and some new experimenta
 ### GGML quants:
 https://huggingface.co/TheBloke/EverythingLM-13B-16K-GGML
 
-I've had trouble with the GGML quants, especially the k-quants act very weird. This is an issue that I will investigate for the next version, for now I recommend GPTQ if possible:
+Make sure to use correct rope scaling settings:
+`-c 16384 --rope-freq-base 10000 --rope-freq-scale 0.25`
 ### GPTQ quants:
 https://huggingface.co/TheBloke/EverythingLM-13B-16K-GPTQ
 
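
For context, a minimal sketch of how these flags might be used with llama.cpp's GGML-era `main` binary. The model filename and prompt are placeholders, not from this repo; `--rope-freq-scale 0.25` applies 4x linear position interpolation, stretching the model's native 4096-token positions to cover the 16384-token window (4096 / 0.25 = 16384) while leaving the rope frequency base at the LLaMA default of 10000.

```sh
# Hedged example invocation; the model filename below is a placeholder for
# whichever GGML quant you downloaded from the link above.
./main \
  -m everythinglm-13b-16k.ggmlv3.q4_K_M.bin \
  -c 16384 \
  --rope-freq-base 10000 \
  --rope-freq-scale 0.25 \
  -p "Tell me about the EverythingLM dataset."
```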