---
base_model: Alsebay/RainyMotip-2x7B
license: apache-2.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- moe
- merge
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# Alsebay/RainyMotip-2x7B AWQ
- Model creator: Alsebay
- Original model: RainyMotip-2x7B
## Model Summary
What is it? A 2x7B MoE model for Roleplay(?).
You may occasionally get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.
You may also want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard
This model is a Mixture of Experts (MoE) made with the following models:
- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
If you use it, please let me know whether it is good or not. Thank you :)
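
Since this is a 4-bit AWQ quantization, it can be loaded directly with the `transformers` library once `autoawq` is installed. Below is a minimal sketch; the repository id is a placeholder (an assumption), so substitute the actual location of the quantized weights:

```python
# Minimal sketch for running the AWQ quant with transformers.
# Assumes a CUDA GPU and the `autoawq` package are available.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption) -- replace with the actual AWQ repository.
model_id = "Suparious/RainyMotip-2x7B-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on the available GPU(s)
    torch_dtype="auto",  # keep the dtype stored in the checkpoint
)

prompt = "Two travelers shelter from the rain and strike up a conversation."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling settings such as `temperature` are only illustrative; tune them to taste for roleplay rerolls.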