BioQwen: A Small-Parameter, High-Performance Bilingual Model for Biomedical Multi-Tasks
This repository contains the quantized weights for the 1.8B version of BioQwen, produced with the MLC-LLM project. The BioQwen.apk, available via this link, automatically downloads the files from this repository when run, so there is generally no need to download or use these files separately.