Text Generation
GGUF
English
mixture of experts
Mixture of Experts
4x8B
Llama3 MOE
creative
creative writing
fiction writing
plot generation
sub-plot generation
story generation
scene continue
storytelling
fiction story
science fiction
romance
all genres
story
writing
vivid prosing
vivid writing
fiction
roleplaying
bfloat16
swearing
rp
horror
mergekit
Inference Endpoints
conversational
Update README.md
README.md CHANGED
@@ -97,7 +97,7 @@ The mixture of experts is set at 2 experts, but you can use 3 or 4 too.
 
 That means the power of every model is available during instruction and output generation.
 
-This brings
+This brings unparalleled power to all forms of generation and all use cases.
 
 <B>Template:</B>
 
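Since the hunk's context line notes that the mixture of experts defaults to 2 active experts but can be run with 3 or 4, below is a minimal sketch of one way that setting might be applied at load time with llama-cpp-python. The `kv_overrides` metadata key `llama.expert_used_count` and the GGUF file name are assumptions for illustration, not something stated in this commit.

```python
# Sketch: raising the number of active experts when loading the GGUF.
# Assumes llama-cpp-python is installed and the model exposes the usual
# "llama.expert_used_count" metadata key; the file path is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="Llama3-MOE-4x8B-Q4_K_M.gguf",    # hypothetical local GGUF file
    n_ctx=4096,                                   # context window size
    kv_overrides={"llama.expert_used_count": 3},  # use 3 experts instead of the default 2
)

# Simple generation call to confirm the model loads and responds.
out = llm("Write the opening scene of a gothic horror story.", max_tokens=256)
print(out["choices"][0]["text"])
```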