---
library_name: transformers
license: apache-2.0
base_model:
- arcee-ai/SuperNova-Medius
tags:
- llmcompressor
---

# SuperNova-Medius-FP8-Dynamic

This is an FP8-quantized version of [arcee-ai/SuperNova-Medius](https://huggingface.co/arcee-ai/SuperNova-Medius), produced with the [llmcompressor](https://github.com/vllm-project/llm-compressor) library. For details on the quantization method, see the [FP8 W8A8 quantization example](https://github.com/vllm-project/llm-compressor/tree/main/examples/quantization_w8a8_fp8) in the llm-compressor repository.
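
A quantization run of this kind typically looks like the following sketch, modeled on the linked llm-compressor FP8 example. The exact import paths, the `save_compressed` argument, and the output directory name are assumptions and may differ between llm-compressor versions; this is not the exact script used to produce this checkpoint.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot

MODEL_ID = "arcee-ai/SuperNova-Medius"
SAVE_DIR = "SuperNova-Medius-FP8-Dynamic"  # assumed output directory name

# Load the original model and tokenizer.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, device_map="auto", torch_dtype="auto"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# FP8_DYNAMIC: weights are quantized to FP8 ahead of time, while activation
# scales are computed dynamically at runtime, so no calibration dataset is
# needed. The lm_head is kept in higher precision.
recipe = QuantizationModifier(
    targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"]
)

# Apply the recipe in one shot and save the compressed checkpoint.
oneshot(model=model, recipe=recipe)
model.save_pretrained(SAVE_DIR, save_compressed=True)
tokenizer.save_pretrained(SAVE_DIR)
```

Because the dynamic scheme derives activation scales at inference time, `oneshot` is called without a calibration dataset; the resulting checkpoint can then be loaded directly by inference engines such as vLLM that understand the compressed-tensors format.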