Model Details

This is kaitchup/Mayonnaise-4in1-02 quantized to 4-bit with GPTQ.

The original model was created using a recipe detailed in this article: The Mayonnaise: Rank First on the Open LLM Leaderboard with TIES-Merging
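The quantized checkpoint can be loaded like any other GPTQ model on the Hub. Below is a minimal sketch using the transformers library; it assumes the optimum and auto-gptq (or gptqmodel) backends plus accelerate are installed, and the prompt is only illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository name of this quantized model (assumption: loaded directly from the Hub)
model_id = "kaitchup/Mayonnaise-4in1-02-gptq-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# GPTQ weights are dequantized on the fly; device_map="auto" places layers on the available GPU(s)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative prompt, not from the model card
prompt = "The recipe for mayonnaise is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```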
