Update README.md
README.md
CHANGED
@@ -201,6 +201,7 @@ and stick to more traditional fare. Yarr!</s>
 
 [More Information Needed]
 
+## More Information
 
 [More Information Needed]
 
@@ -225,7 +226,10 @@ The following `bitsandbytes` quantization config was used during training:
 - PEFT 0.6.3.dev0
 
 -->
-
+#### Summary
 
 [Zephyr-7B-β](https://arxiv.org/abs/2305.18290) is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
-[Zephyr-7B technical report](https://arxiv.org/abs/2310.16944)
+[Zephyr-7B technical report](https://arxiv.org/abs/2310.16944)
+
+[LoRA](https://arxiv.org/abs/2106.09685)
+[QLoRA](https://arxiv.org/abs/2305.14314)
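Since the added summary describes a PEFT (LoRA/QLoRA) fine-tune of Mistral-7B-v0.1 and the card records a `bitsandbytes` quantization config, here is a minimal sketch of how such an adapter is typically loaded back onto the 4-bit base model. The adapter repo id `your-username/zephyr-qlora-adapter` is a placeholder for whichever repository this README belongs to, and the `BitsAndBytesConfig` values are assumptions that should be replaced with the exact quantization config printed earlier in the card.

```python
# Minimal sketch (not this card's verbatim recipe): reload a QLoRA/PEFT
# adapter onto a 4-bit quantized Mistral-7B-v0.1 base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# Assumed 4-bit settings; mirror the `bitsandbytes` config shown in this card.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

# Placeholder adapter id: substitute this repository's actual id.
model = PeftModel.from_pretrained(base, "your-username/zephyr-qlora-adapter")
model.eval()
```

LoRA trains small low-rank update matrices while the base weights stay frozen; QLoRA additionally keeps that frozen base in 4-bit NF4, which is why the adapter (PEFT 0.6.3.dev0 in this card) is loaded on top of a quantized base model rather than shipped as full fine-tuned weights.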