icefog72 committed
Commit feacee4
1 Parent(s): fb48839

Update README.md

Files changed (1)
  1. README.md +48 -1
README.md CHANGED
@@ -114,9 +114,17 @@ model-index:
 ---
 # IceCoffeeRP-7b (IceCoffeeTest11)
 
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 
 ## Merge Details
+This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+Prompt template: Alpaca, maybe ChatML
+
+* measurement.json for quanting exl2 included.
+
+- [4.2bpw-exl2](https://huggingface.co/icefog72/IceCoffeeRP-7b-4.2bpw-exl2)
+- [6.5bpw-exl2](https://huggingface.co/icefog72/IceCoffeeRP-7b-6.5bpw-exl2)
+- [8bpw-exl2](https://huggingface.co/icefog72/IceCoffeeRP-7b-8bpw-exl2)
+
 ### Merge Method
 
 This model was merged using the SLERP merge method.
@@ -150,6 +158,45 @@ parameters:
 - value: 0.5
 dtype: float16
 ```
+## How to download from the command line
+
+I recommend using the `huggingface-hub` Python library:
+
+```shell
+pip3 install huggingface-hub
+```
+
+To download the `main` branch to a folder called `IceCoffeeRP-7b`:
+
+```shell
+mkdir IceCoffeeRP-7b
+huggingface-cli download icefog72/IceCoffeeRP-7b --local-dir IceCoffeeRP-7b --local-dir-use-symlinks False
+```
+
+<details>
+<summary>More advanced huggingface-cli download usage</summary>
+
+If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (on Linux the default location is `~/.cache/huggingface`), and symlinks pointing to their real location in the cache will be added to the specified `--local-dir`. This allows interrupted downloads to be resumed and lets you quickly clone the repo to multiple places on disk without triggering a fresh download. The downside, and the reason I don't list it as the default option, is that the files are then hidden away in a cache folder, so it is harder to see where your disk space is being used and to clear it up if/when you want to remove a downloaded model.
+
+The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
+
+For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
+
+To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
+
+```shell
+pip3 install hf_transfer
+```
+
+And set the environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
+
+```shell
+mkdir FOLDERNAME
+HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MODEL --local-dir FOLDERNAME --local-dir-use-symlinks False
+```
+
+Windows Command Line users: you can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
+</details>
 
 # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceCoffeeTest11)
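
The README only names the SLERP merge method without spelling it out. As a minimal illustration of the underlying idea — not mergekit's actual implementation, which works tensor-by-tensor with its own normalization, fallbacks, and dtype handling — spherical linear interpolation between two weight vectors can be sketched like this, where the interpolation factor `t` plays the role of the `value: 0.5` parameter in the config hunk above:

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at factor t in [0, 1]."""
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, with the dot product clamped for safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    # Standard SLERP weights: sin((1-t)*omega)/sin(omega) and sin(t*omega)/sin(omega).
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Unlike plain averaging, SLERP follows the arc between the two parameter directions, which is why it tends to preserve the scale of the interpolated weights.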