---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
# ChatGLM3-6B
This version is identical to [THUDM/ChatGLM3-6B](https://huggingface.co/THUDM/chatglm3-6b); the only change is that `chat_template` and `special_tokens` have been added to the tokenizer configuration.
With these additions, the tokenizer works with the standard HF `.apply_chat_template` API, unifying the usage.
The fixes have also been submitted upstream to `THUDM/chatglm3-6b` and are awaiting merge:
- https://huggingface.co/THUDM/chatglm3-6b/discussions/22
- https://huggingface.co/THUDM/chatglm3-6b/discussions/36
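
To illustrate what a chat template does, the sketch below renders a message list into a single prompt string by hand. The role markers `<|system|>`, `<|user|>`, and `<|assistant|>` follow the ChatGLM3 convention, but the exact whitespace and token layout of the real Jinja template in `tokenizer_config.json` may differ; this is an illustration, not the actual template.

```python
# Minimal sketch of chat templating: turn a list of {"role", "content"}
# messages into one prompt string. Role markers follow the ChatGLM3
# convention (<|system|>, <|user|>, <|assistant|>); the real template's
# whitespace may differ.
def render_chat(messages, add_generation_prompt=False):
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}")
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here
        parts.append("<|assistant|>")
    return "\n".join(parts)

prompt = render_chat(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "How can we slow global warming?"},
    ],
    add_generation_prompt=True,
)
print(prompt)
```

`tokenizer.apply_chat_template(..., tokenize=False)` performs the same kind of rendering, driven by the Jinja template stored with the tokenizer, before optionally tokenizing the result.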
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id_or_path = "p208p2002/chatglm3-6b-chat-template"
tokenizer = AutoTokenizer.from_pretrained(model_id_or_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id_or_path, device_map="auto", trust_remote_code=True)

# Build the prompt with the standard HF chat-template API
inputs = tokenizer.apply_chat_template(
    [
        {"role": "system", "content": "You are a helpful, respectful, and honest assistant. Always answer questions as helpfully as possible. If you do not know the answer to a question, do not provide false information."},
        {"role": "user", "content": "How can we slow global warming?"},
    ],
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
).to(model.device)  # move the input ids to the model's device

out = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(out[0]))
```