Update README.md
README.md
@@ -13,7 +13,7 @@ With LLaMA2-Accessory, mixtral-8x7b enjoys the following features:
 4. Distributed and/or quantized inference

 ## 🔥 Online Demo
-We host a web demo at <https://
+We host a web demo at <https://5e1109637f49baae47.gradio.live/>, which shows a mixtral-8x7b model finetuned on
 [evol-codealpaca-v1](https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1) and
 [ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k), with LoRA and Bias tuning.
 Please note that this is a temporary link, and we will update our official permanent link today.