Update README.md
README.md
CHANGED
@@ -39,6 +39,9 @@ Stock for the "True Merge" -- This was a TIES Merge, the reasoning is explained
 - Sao10K/L3.3-70B-Euryale-v2.3
 - (Custom Base Model-Stock Soup -- Recipe Below)
 
+Base
+- huihui-ai/Llama-3.1-Nemotron-70B-Instruct-HF-abliterated
+
 One note here, I wasn't really sure how to state this in the huggingface tags. This model is actually THREE different merges. There's a base history merge, which was rolled into a base model merge, and you can see we merged the bases with our instruct models. Whew. I tried to give a thorough overview of model contributions, but not all of them contribute to the "final" merge directly.
 
 
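For context on what the added "Base" entry does: the hunk header notes this list is the stock for the "True Merge", which was a TIES merge, and the new line names the base model that merge resolves deltas against. The actual recipe lives further down in the README; the block below is only a minimal mergekit-style sketch of a TIES config of this shape, using the models named in this hunk. The density, weight, normalize, and dtype values are placeholders, not the author's settings.

```yaml
# Hypothetical sketch only -- see the recipe in the README for the real values.
models:
  - model: Sao10K/L3.3-70B-Euryale-v2.3   # one member of the stock list above
    parameters:
      density: 0.5                        # placeholder TIES density
      weight: 0.5                         # placeholder TIES weight
  # ... remaining stock members, including the custom base model-stock soup,
  #     go here per the recipe below in the README
merge_method: ties
base_model: huihui-ai/Llama-3.1-Nemotron-70B-Instruct-HF-abliterated  # the "Base" added in this commit
parameters:
  normalize: true                         # placeholder
dtype: bfloat16                           # placeholder
```

A config like this would be run with mergekit's `mergekit-yaml` command, pointing it at the YAML file and an output directory.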