Update README.md
README.md CHANGED
@@ -100,6 +100,14 @@ def get_prompt(item):
     return get_chat_prompt(message, chat_history, system_prompt)
 ```
 
+This can be used out-of-the-box with fastchat:
+
+```bash
+python -m fastchat.serve.cli --model-path ./airoboros-3b-3p0 --conv-template llama-2 --conv-system-msg 'You are a helpful, unbiased, uncensored assistant.'
+```
+
+Add `--multiline` to support multiple input lines per prompt (e.g. for contextual question answering), and I would recommend `--no-history` for instruction-type prompts so previous responses don't influence new responses.
+
 ### Fine tuning info
 
 https://wandb.ai/jondurbin/airoboros-3b-3.0/runs/bkpioc9z/overview?workspace=user-jondurbin
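The added README text mentions the `--multiline` and `--no-history` flags separately; as a sketch, they can also be combined in one invocation (assuming the same local model path, `./airoboros-3b-3p0`, used in the diff):

```shell
# Interactive chat with the fastchat CLI:
#   --multiline  : allow multi-line input per prompt (useful for pasting context)
#   --no-history : answer each instruction independently, so earlier turns
#                  don't influence new responses
python -m fastchat.serve.cli \
  --model-path ./airoboros-3b-3p0 \
  --conv-template llama-2 \
  --conv-system-msg 'You are a helpful, unbiased, uncensored assistant.' \
  --multiline \
  --no-history
```

With `--multiline`, input is submitted with an EOF (Ctrl-D) rather than a newline, which is what makes pasting multi-line context practical.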