Update README.md
README.md
CHANGED
@@ -43,8 +43,12 @@ Visit our Project Git here: https://github.com/Digitous/LLM-SLERP-Merge
Spherical Linear Interpolation (SLERP) merging produces smoother, more coherent merges than a standard weight merge, also known as LERP (linear interpolation).

## What Makes Naberius Special?
-By combining zephyr-7b-sft-beta and OpenHermes-2-Mistral-7B, then adding dolphin-2.2.1-mistral-7b to the result using a minimally destructive merge technique, preserves a large amount of behavior of all three models in a cohesive fashion.
-
+Combining zephyr-7b-sft-beta and OpenHermes-2-Mistral-7B, then adding dolphin-2.2.1-mistral-7b to the result with a minimally destructive merge technique, preserves a large amount of the behavior of all three models in a cohesive fashion.
+
+
+Naberius can: deliver coherent roleplay far beyond any prior 7B-parameter model, follow instructions exceptionally well for its size, and, being lightweight, run with incredible inference speed. Naberius has shown some signs of spatial awareness and adapts to nuance in conversation. All around, it is a pliable, imaginative, and logic-oriented 7B that at times punches up to what feels like a 30B or more.
+
+
Naberius can't: walk your dog, do your homework, clean your dishes, or tell you to turn off the computer and go to bed on time.
# Ensemble Credits:
All models merged to create Naberius are LLaMAv2-7B Mistral-7B series models.
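The README above contrasts SLERP merging with plain LERP. As a rough illustration of that difference (a minimal sketch only, not the actual implementation in the LLM-SLERP-Merge repository; the tensor shapes, the 0.5 blend ratios, and the per-tensor two-stage merge shown in the usage lines are assumptions), here is a small PyTorch example:

```python
import torch

def lerp(a: torch.Tensor, b: torch.Tensor, t: float) -> torch.Tensor:
    # Standard linear interpolation of two weight tensors.
    return (1.0 - t) * a + t * b

def slerp(a: torch.Tensor, b: torch.Tensor, t: float, eps: float = 1e-7) -> torch.Tensor:
    # Spherical linear interpolation: move along the arc between the two
    # (flattened, normalized) weight vectors instead of the straight chord.
    a_flat, b_flat = a.flatten(), b.flatten()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.acos(dot)                   # angle between the weight vectors
    if omega.abs().item() < eps:              # nearly parallel: SLERP ~ LERP
        return lerp(a, b, t)
    sin_omega = torch.sin(omega)
    mixed = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat + \
            (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape)

# Hypothetical per-tensor usage mirroring the two-stage merge described above
# (the 0.5 ratios are assumptions, not the values used for Naberius):
w_zephyr, w_hermes, w_dolphin = (torch.randn(4096, 4096) for _ in range(3))
stage_one = slerp(w_zephyr, w_hermes, t=0.5)
merged = slerp(stage_one, w_dolphin, t=0.5)
```

Interpolating along the arc keeps the blended weights closer to the norms of the originals, whereas a straight LERP between weight vectors pointing in different directions tends to shrink them, which is one intuition for why SLERP merges can feel less "washed out."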
|