ChatGLM-6B Mirror

ChatGLM-6B is an open-source, bilingual conversational language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Combined with model quantization techniques, it can be deployed locally on consumer-grade graphics cards (requiring as little as 6GB of video memory at the INT4 quantization level). ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese Q&A and dialogue. Trained on approximately 1T tokens of Chinese and English text, and further aligned through supervised fine-tuning, feedback bootstrap, and reinforcement learning from human feedback, the 6.2-billion-parameter ChatGLM-6B is able to generate responses that are fairly consistent with human preferences.
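
For reference, below is a minimal sketch of such a quantized local deployment. It assumes this mirror ships the upstream ChatGLM-6B modeling code (loaded via trust_remote_code), whose quantize() method performs the INT4 conversion, and that a CUDA GPU is available.

# Sketch only: INT4 quantization to fit roughly 6GB of video memory.
# Assumes the upstream ChatGLM-6B remote code with its quantize() method.
from modelscope import snapshot_download
from transformers import AutoModel, AutoTokenizer

model_dir = snapshot_download('Genius-Society/chatglm_6b')
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).quantize(4).half().cuda()
model = model.eval()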

Usage

from modelscope import snapshot_download
model_dir = snapshot_download('Genius-Society/chatglm_6b')
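
The snippet above only downloads the model snapshot. The sketch below continues from it and runs one chat turn, assuming the mirror keeps the upstream ChatGLM-6B remote code and its chat() interface, and that a CUDA GPU with sufficient memory is available.

# Sketch of running inference on the downloaded snapshot (assumes the
# upstream ChatGLM-6B remote code and chat() interface, plus a CUDA GPU).
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# One chat turn; chat() returns the reply and the updated dialogue history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)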

Maintenance

git clone git@www.modelscope.cn:Genius-Society/chatglm_6b
cd chatglm_6b

Reference

[1] ChatGLM-6B
