
A mega-merge of 10 selected models, combined with the "Model Stock" method and using Poppy Porpoise 0.72 as the base. It performs well across general tasks, with a strong focus on RP and storytelling, is uncensored (it will do pretty much anything you ask), and has a large knowledge base.
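For readers curious about the merge recipe, below is a minimal sketch of what a Model Stock merge config looks like with mergekit. The donor model names and the base-model repo id are placeholders, not the actual recipe used for NeuralPoppy-EVO-L3-8B.

```python
# Hypothetical Model Stock merge sketch for mergekit (pip install mergekit pyyaml).
# Model names below are placeholders; the real merge combined 10 donor models.
import subprocess
import yaml

config = {
    "merge_method": "model_stock",  # mergekit's implementation of the Model Stock method
    "base_model": "ChaoticNeutrals/Poppy_Porpoise-0.72-L3-8B",  # assumed repo id for the base
    "models": [
        {"model": "example-org/donor-model-1"},  # placeholder donor
        {"model": "example-org/donor-model-2"},  # placeholder donor
    ],
    "dtype": "float16",
}

with open("neuralpoppy-merge.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# Run the merge with mergekit's CLI; the merged weights land in ./merged-model
subprocess.run(["mergekit-yaml", "neuralpoppy-merge.yaml", "./merged-model"], check=True)
```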

SillyTavern (ST) presets are included in the repo.

Imatrix quants are available here: https://huggingface.co/zeroblu3/NeuralPoppy-EVO-L3-imat.GGUF
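If you want to run the full-precision weights rather than the GGUF quants, here is a minimal loading sketch with Hugging Face Transformers. The prompt and sampling settings are illustrative, not recommended presets.

```python
# Minimal sketch: load and sample from the FP16 weights with Transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zeroblu3/NeuralPoppy-EVO-L3-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # card lists FP16 tensors
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short opening scene for a fantasy story."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```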

Model size: 8.03B params (FP16, Safetensors)