```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer; trust_remote_code=True pulls in the
# repository's custom modeling code, which provides the chat() helper.
model_name = "shellchat-v1"
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Single-turn chat: pass the user query and an empty conversation history.
query = "hello world!"
history = []

response = model.chat(query, history, tokenizer)
print(response)
```
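
If the checkpoint's custom `chat()` helper is not available (for example, when `trust_remote_code` cannot be enabled), the same request can be served through the standard `generate()` API. The following is a minimal sketch, not the model's documented interface; it assumes the checkpoint accepts a plain text prompt without a special chat template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "shellchat-v1"
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Tokenize the prompt and move the tensors to the model's device.
inputs = tokenizer("hello world!", return_tensors="pt").to("cuda")

# Generate a completion with the standard generate() API.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
response = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)
print(response)
```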