Add chat_class example

Add an example of using the included chat class `Chat` to the README.

README.md CHANGED
@@ -42,6 +42,24 @@ inputs = tokenizer.apply_chat_template(messages)
 print(tokenizer.decode(model.generate(**inputs)[0]))
 ```
 
+Alternatively, copy the included `chat_class.py` module into your local
+directory and just import the `Chat` class:
+```
+from chat_class import Chat
+chat = Chat() # default args: Chat("mathewhe/DCLM-7B-Chat", device="cuda")
+
+# for one-off instructions
+instruction = "Write a list of ingredients for banana pudding."
+print(chat.instruct(instruction))
+
+# for multi-turn chat
+response1 = chat.message("Who was Stan Lee?")
+response2 = chat.message("What was his wife's name?")
+
+# to reset the chat
+chat.reset()
+```
+
 ## Chat template
 
 This model uses the following chat template and does not support a separate