---
base_model:
- mistralai/Mistral-7B-Instruct-v0.2
library_name: transformers
tags:
- mergekit
- merge
---
# bigstral-12b-32k-8xMoE

Made using the mergekit MoE branch with the following config:

```
base_model: abacusai/bigstral-12b-32k
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
```
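The config builds a Mixtral-style mixture-of-experts from eight identical copies of abacusai/bigstral-12b-32k, with randomly initialized gates (`gate_mode: random`) and two experts routed per token. The merged model loads like any other causal LM in transformers; below is a minimal sketch, assuming the merged weights are available under a repo id or local path such as `bigstral-12b-32k-8xMoE` (substitute the actual location).

```python
# Minimal usage sketch for the merged MoE model.
# The model_id below is an assumption; point it at the actual Hub repo or local output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigstral-12b-32k-8xMoE"  # hypothetical path / repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Chat formatting follows the Mistral-Instruct template inherited from the base model.
messages = [{"role": "user", "content": "Summarize mixture-of-experts models in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```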