Update README.md
All glory to [Sao10K](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2) (WinterGoddess), [sophosympatheia](https://huggingface.co/sophosympatheia/Midnight-Rose-70B-v1.0) (Midnight Rose), [jondurbin](https://huggingface.co/jondurbin/spicyboros-70b-2.2) (spicyboros), [ChuckMcSneed](https://huggingface.co/datasets/ChuckMcSneed/NeoEvalPlusN_benchmark) (datasets), [alpindale](https://huggingface.co/alpindale/goliath-120b) (inspired by Goliath), and [cg123](https://github.com/cg123/mergekit) (mergekit).
A simple personal merge test model, merged for RP and quantized to 2.9bpw to fit 2x3090/2x4090 at 4~8k context.
Tried three 120b recipes; this one performed surprisingly well: very smart and sensitive.
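
Goliath-style 120b merges of this kind are typically built with mergekit's `passthrough` method, interleaving layer slices from the two 70b parents. The exact recipe used here is not published; the config below is only an illustrative sketch (the layer ranges and model pairing are assumptions, not the actual recipe):

```yaml
# Hypothetical Goliath-style interleave of two 70b parents.
# Layer ranges are illustrative only, NOT the recipe used for this model.
slices:
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [0, 40]
  - sources:
      - model: sophosympatheia/Midnight-Rose-70B-v1.0
        layer_range: [20, 60]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

Overlapping the slice boundaries (as above) is the usual trick for growing a ~120b frankenmerge out of 80-layer 70b models; varying those overlaps is one way the "three 120b recipes" could have differed.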