---
license: apache-2.0
---

An int4 quantized version of the GLM-130B model that can run inference on four RTX 3090 Ti GPUs (4 × 24 GB).
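For context, int4 weight quantization stores roughly 0.5 bytes per parameter, so the 130B parameters come to about 65 GB of weights, which is why the checkpoint can be sharded across four 24 GB cards. The snippet below is a minimal NumPy sketch of row-wise symmetric int4 quantization, meant only to illustrate the general idea; the function names are hypothetical and this is not the exact scheme or kernel used to produce this checkpoint.

```python
import numpy as np

def quantize_int4_rowwise(w: np.ndarray):
    """Symmetric per-row int4 quantization of a 2-D weight matrix (illustrative)."""
    # One scale per row, chosen so the row's absolute maximum maps to 7
    # (the largest magnitude representable by a signed 4-bit code).
    scale = np.maximum(np.abs(w).max(axis=1, keepdims=True), 1e-8) / 7.0
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)  # int4 codes stored in int8
    return q, scale

def dequantize(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """Recover an approximate float32 weight matrix from codes and per-row scales."""
    return q.astype(np.float32) * scale

# Quantize a small random "weight" matrix and check the reconstruction error.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8)).astype(np.float32)
q, scale = quantize_int4_rowwise(w)
w_hat = dequantize(q, scale)
print("max abs error:", float(np.abs(w - w_hat).max()))
```

Since this card carries no library tag, the checkpoint is presumably meant to be loaded with the official GLM-130B inference code (https://github.com/THUDM/GLM-130B) rather than through a generic 🤗 Transformers pipeline.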


Contact: [email protected]
