Text Generation
Transformers
Safetensors
mixtral
Mixture of Experts
frankenmoe
Merge
mergekit
lazymergekit
Locutusque/TinyMistral-248M-v2
Locutusque/TinyMistral-248M-v2.5
Locutusque/TinyMistral-248M-v2.5-Instruct
jtatman/tinymistral-v2-pycoder-instruct-248m
Felladrin/TinyMistral-248M-SFT-v4
Locutusque/TinyMistral-248M-v2-Instruct
text-generation-inference
Inference Endpoints
Update README.md
README.md CHANGED
@@ -41,6 +41,8 @@ TinyMistral-6x248M is a Mixure of Experts (MoE) made with the following models u
 * [Felladrin/TinyMistral-248M-SFT-v4](https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v4)
 * [Locutusque/TinyMistral-248M-v2-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-v2-Instruct)
 
+This model will be further pre-trained on nampdn-ai/mini-peS2o.
+
 ## 🧩 Configuration
 
 ```yaml
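
For context, a minimal sketch of how the continued pre-training on nampdn-ai/mini-peS2o mentioned in the diff might be set up with the Hugging Face `transformers` and `datasets` libraries. The repository ID, dataset column handling, and hyperparameters below are illustrative assumptions, not details taken from this model card.

```python
# Hedged sketch of continued pre-training on nampdn-ai/mini-peS2o.
# The repo ID is a placeholder and every hyperparameter is illustrative.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "your-namespace/TinyMistral-6x248M"  # placeholder, not the confirmed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # Mistral-style tokenizers often lack a pad token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Stream the corpus so the full dataset is not downloaded up front; this assumes
# mini-peS2o exposes a `text` column (as in the upstream peS2o dataset) and that
# its column metadata is available in streaming mode.
dataset = load_dataset("nampdn-ai/mini-peS2o", split="train", streaming=True)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="tinymistral-6x248m-pes2o",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    max_steps=1_000,  # streaming datasets need an explicit step budget; real runs would be far longer
    logging_steps=50,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
```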