The model performs exceptionally well on writing, explanation and discussion tasks.
## Use in 🤗Transformers

First install the direct dependencies:

```bash
pip install transformers torch sentencepiece
```

If you want faster inference using flash-attention2, you need to install these dependencies:

```bash
pip install packaging ninja
```
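Once the dependencies are installed, loading the model follows the standard 🤗Transformers pattern. Below is a minimal sketch; the repo id `your-org/your-model` is a placeholder (substitute the model's actual Hugging Face id), and passing `attn_implementation="flash_attention_2"` assumes the optional flash-attention2 dependencies above have been installed:

```python
# Minimal loading sketch. Assumptions: MODEL_ID is a placeholder, not
# the real checkpoint name; the flash-attention2 path requires the
# optional dependencies from the install step above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-org/your-model"  # placeholder; replace with the real repo id


def load(use_flash_attention: bool = False):
    """Return (tokenizer, model), optionally with flash-attention2 enabled."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        # Falls back to PyTorch's built-in SDPA attention when
        # flash-attention2 is not requested.
        attn_implementation="flash_attention_2" if use_flash_attention else "sdpa",
        device_map="auto",
    )
    return tokenizer, model
```

Note that `flash_attention_2` only applies on supported GPUs and half-precision dtypes; on CPU-only setups, leave the default attention implementation.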