---
license: mit
---
# Miquliz-120b-v2.0-FP8-dynamic
This quant was made for infermatic.ai.

A dynamic FP8 quant of Miquliz 120B v2.0, made with AutoFP8.
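For reference, a minimal sketch of how a dynamic FP8 quant like this one can be produced with AutoFP8. The source and output paths are placeholders, and the exact settings used for this quant are not documented here; this only illustrates the `activation_scheme="dynamic"` workflow, which needs no calibration data because activation scales are computed at runtime.

```python
from auto_fp8 import AutoFP8ForCausalLM, BaseQuantizeConfig

# Placeholder paths; the actual source and output locations are assumptions.
pretrained_model_dir = "wolfram/miquliz-120b-v2.0"
quantized_model_dir = "Miquliz-120b-v2.0-FP8-dynamic"

# Dynamic activation scheme: weights are quantized to FP8 offline, activation
# scales are determined per batch at inference time, so no calibration set is needed.
quantize_config = BaseQuantizeConfig(
    quant_method="fp8",
    activation_scheme="dynamic",
)

model = AutoFP8ForCausalLM.from_pretrained(
    pretrained_model_dir, quantize_config=quantize_config
)
model.quantize([])  # empty calibration set is sufficient for the dynamic scheme
model.save_quantized(quantized_model_dir)
```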
## Model Details
- Max Context: 32768 tokens
- Layers: 140
## Prompt template: Mistral
```
<s>[INST] {prompt} [/INST]
```
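A hedged usage sketch for serving this quant with vLLM and applying the Mistral template above. The repository name, GPU count, and sampling settings are assumptions, not part of this card; adjust them for your deployment, and note that vLLM's tokenizer may prepend the BOS token itself.

```python
from vllm import LLM, SamplingParams

# Repo name and parallelism are illustrative; adjust for your hardware.
llm = LLM(
    model="Infermatic/Miquliz-120b-v2.0-FP8-dynamic",
    max_model_len=32768,     # matches the model's max context
    tensor_parallel_size=4,  # a 120B model still needs multiple GPUs at FP8
)

def mistral_prompt(user_message: str) -> str:
    # Wrap the user message in the Mistral instruct template shown above.
    return f"<s>[INST] {user_message} [/INST]"

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate([mistral_prompt("Write a haiku about quantization.")], params)
print(outputs[0].outputs[0].text)
```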