Mixtral-8x7B-MoE-RP-Story is a model made primarily for chatting, RP (roleplay) and storywriting. Two RP models, two chat models, one occult model, one storywriting model, one mathematics model and one DPO model were combined into the MoE, with Bagel as the base.
The DPO chat model is included to help produce more human-like replies.
This is my first try at doing this, so don't hesitate to give feedback!
WARNING: ALL THE "K" GGUF QUANTS OF MIXTRAL MODELS SEEM TO BE BROKEN; PREFER Q4_0, Q5_0 OR Q8_0!
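For example, here is a minimal loading sketch with llama-cpp-python using one of the working quants; the GGUF filename below is hypothetical, so substitute whatever quant file you actually downloaded:

```python
# Minimal sketch, assuming llama-cpp-python is installed and a non-"K"
# quant (e.g. Q5_0) of this model has been downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-moe-rp-story.Q5_0.gguf",  # hypothetical filename
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

out = llm("Write the opening line of a fantasy story.", max_tokens=64)
print(out["choices"][0]["text"])
```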
## Description
This repo contains fp16 files of Mixtral-8x7B-MoE-RP-Story.
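If you want to run the fp16 weights directly, a minimal transformers sketch looks like this; the repo id is an assumption based on this card's name, and fp16 Mixtral needs a lot of VRAM, so adjust accordingly:

```python
# Minimal sketch, assuming transformers and accelerate are installed
# and the repo id below matches this card (adjust if it does not).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Undi95/Mixtral-8x7B-MoE-RP-Story"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # the fp16 files this repo provides
    device_map="auto",          # spread layers across available devices
)

inputs = tokenizer("Once upon a time,", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```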
## Models used
The list of models used, and the activator/theme of each, can be found here
## Prompt template: Custom
Using Bagel as the base theoretically gives access to a lot of different prompting systems; you can see all the available prompt formats here.
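As an illustration, here is one prompt style that Bagel-based models commonly accept (Alpaca); this is an assumption, so check the linked list for the formats this merge actually supports:

```python
# Hypothetical Alpaca-style prompt; verify against the linked prompt
# format list before relying on it.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Play the role of a grizzled sea captain and greet the crew.\n\n"
    "### Response:\n"
)
print(prompt)
```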
If you want to support me, you can here.