---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
---
# MobileLLM-600M-MNN
## Introduction
This is a 4-bit quantized MNN model exported from [MobileLLM-600M](https://huggingface.co/facebook/MobileLLM-600M) using [llm-export](https://github.com/wangzhaode/llm-export).
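
The export can typically be reproduced by pointing llm-export at a local copy of the original checkpoint. The sketch below wraps the `llmexport` command line in Python; the flag names (`--path`, `--export`, `--quant_bit`) follow the llm-export README and may differ between versions, so verify them with `llmexport --help` before running.

```python
import subprocess

# Sketch: export MobileLLM-600M to MNN with 4-bit weight quantization.
# Assumes llm-export is installed (e.g. `pip install llmexport`) and the
# original Hugging Face checkpoint has been downloaded to ./MobileLLM-600M.
subprocess.run(
    [
        "llmexport",
        "--path", "MobileLLM-600M",  # local directory of facebook/MobileLLM-600M
        "--export", "mnn",           # produce an MNN model rather than ONNX
        "--quant_bit", "4",          # 4-bit weight quantization
    ],
    check=True,
)
```

The resulting files can then be loaded with MNN's LLM runtime; see the MNN and llm-export documentation for the exact inference setup.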