Text Generation
GGUF
English
mixture of experts
Mixture of Experts
8x3B
Llama 3.2 MOE
creative
creative writing
fiction writing
plot generation
sub-plot generation
story generation
scene continue
storytelling
fiction story
science fiction
romance
all genres
story
writing
vivid prosing
vivid writing
fiction
roleplaying
bfloat16
swearing
rp
horror
mergekit
Inference Endpoints
conversational
Update README.md
README.md CHANGED

@@ -43,7 +43,7 @@ pipeline_tag: text-generation
 <img src="dark-p-infinite.jpg" style="float:right; width:300px; height:300px; padding:10px;">
 
 It is a LLama 3.2 model, max context of 128k (131,000) using mixture of experts to combine EIGHT top L3.2 3B
-models
+models into one massive powerhouse at 18.4B parameters (equal to 24B - 3 X 8 B).
 
 This model's instruction following, and output generation for creative writing, prose, fiction and role play are exceptional.