Ontocord.AI committed
Commit db37306
Parent(s): 2457e12

Update README.md
README.md
CHANGED
```diff
@@ -14,7 +14,7 @@ This model was generated by averaging the weights of the following models
 - [Multi-Domain-Expert-Layers/expert-github](https://huggingface.co/Multi-Domain-Expert-Layers/expert-github)
 - [Multi-Domain-Expert-Layers/expert-uspto](https://huggingface.co/Multi-Domain-Expert-Layers/expert-uspto)
 - [Multi-Domain-Expert-Layers/expert-arxiv](https://huggingface.co/Multi-Domain-Expert-Layers/expert-arxiv)
-- [theblackcat102/pythia-1b-deduped-sft](theblackcat102/pythia-1b-deduped-sft)
+- [theblackcat102/pythia-1b-deduped-sft](https://huggingface.co/theblackcat102/pythia-1b-deduped-sft)
 - We also keep a mixture that is primarily one of the above as an expert that can be loaded on demand.
 
 ### NOTE: There is a mistake below where we are using a routed expert for pubmed-abstract, but we merged pubmed central
```
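
For context, the README describes the model as a uniform weight average of the listed checkpoints. Below is a minimal sketch of what that averaging could look like, assuming all four checkpoints share the Pythia-1B-deduped architecture and parameter names; the output path `expert-uniform-average` is illustrative, not the repo's actual merge script.

```python
import torch
from transformers import AutoModelForCausalLM

# Model IDs taken from the README above; all are Pythia-1B-deduped
# fine-tunes, so their state dicts share parameter names and shapes.
EXPERT_IDS = [
    "Multi-Domain-Expert-Layers/expert-github",
    "Multi-Domain-Expert-Layers/expert-uspto",
    "Multi-Domain-Expert-Layers/expert-arxiv",
    "theblackcat102/pythia-1b-deduped-sft",
]

# Accumulate parameter tensors across experts, then divide by the count.
avg_state = None
for model_id in EXPERT_IDS:
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)
    state = model.state_dict()
    if avg_state is None:
        avg_state = {k: v.clone() for k, v in state.items()}
    else:
        for k, v in state.items():
            if v.is_floating_point():
                avg_state[k] += v
    del model  # release each expert before loading the next

for k, v in avg_state.items():
    if v.is_floating_point():
        v /= len(EXPERT_IDS)

# Write the averaged weights into a fresh copy of the shared architecture.
merged = AutoModelForCausalLM.from_pretrained(EXPERT_IDS[0], torch_dtype=torch.float32)
merged.load_state_dict(avg_state)
merged.save_pretrained("expert-uniform-average")
```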