This is a Mixture-of-Experts (MoE) model built from a mix of domain-agnostic fine-tuned models derived from the base Mistral model.
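A minimal usage sketch, assuming the checkpoint loads through the standard Hugging Face `transformers` causal-LM API (the repository id matches the model tree below; generation parameters are illustrative only):

```python
# Repository id as listed on this model card.
MODEL_ID = "collaiborate-tech/CollAIborate4x7B"


def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model in BF16, the tensor type reported below.

    Imports are deferred so this module can be inspected without torch or
    transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the BF16 checkpoint
        device_map="auto",           # spread the 24B params across devices
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that loading a 24.2B-parameter model in BF16 requires roughly 48 GB of memory, so `device_map="auto"` (or a quantized variant from the model tree below) is advisable on smaller GPUs.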

Format: Safetensors
Model size: 24.2B params
Tensor type: BF16

Model tree for collaiborate-tech/CollAIborate4x7B

Quantizations: 1 model