An int4-quantized version of the small100 multilingual translation model.
Used in my translation project: https://github.com/BLCK-B/Moerkepub
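For context on what "quantized int4" means here, below is a minimal, self-contained sketch of symmetric int4 weight quantization: floats are mapped to the signed 4-bit range [-8, 7] with a single scale factor, and two 4-bit values are packed into each byte. This is an illustration of the general technique, not the exact procedure used to produce this checkpoint; all function names are hypothetical.

```python
import numpy as np

def quantize_int4(weights: np.ndarray):
    """Symmetric per-tensor int4 quantization: map floats to [-8, 7]."""
    scale = np.abs(weights).max() / 7.0        # 7 = largest positive int4 value
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def pack_int4(q: np.ndarray) -> np.ndarray:
    """Pack two signed 4-bit values into each uint8 (low nibble first)."""
    flat = q.astype(np.uint8) & 0x0F           # keep two's-complement nibbles
    if flat.size % 2:                          # pad to an even count
        flat = np.append(flat, np.uint8(0))
    return (flat[0::2] | (flat[1::2] << 4)).astype(np.uint8)

def unpack_int4(packed: np.ndarray, n: int) -> np.ndarray:
    """Inverse of pack_int4: recover n signed int4 values."""
    flat = np.empty(packed.size * 2, dtype=np.uint8)
    flat[0::2] = packed & 0x0F
    flat[1::2] = packed >> 4
    q = flat[:n].astype(np.int8)
    q[q > 7] -= 16                             # restore the sign bit
    return q

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int4 codes back to approximate float weights."""
    return q.astype(np.float32) * scale

# Round-trip demo on a tiny weight vector.
w = np.array([0.5, -1.0, 0.25, 0.9], dtype=np.float32)
q, s = quantize_int4(w)
w_restored = dequantize(unpack_int4(pack_int4(q), w.size), s)
```

Packing halves the nibble storage to 4 bits per weight (roughly an 8x reduction versus float32, ignoring the scale metadata), at the cost of a bounded rounding error of at most half a quantization step per weight.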
Model tree for BLCK-B/small100-quantized
- Base model: alirezamsh/small100