Update README.md
README.md (CHANGED)
@@ -41,6 +41,11 @@ git clone https://github.com/Dao-AILab/flash-attention.git
 cd flash-attention && pip install . && cd ..
 ```
 
+To use our HF versions of the model, you will need to install the latest transformers, which includes our newly merged implementation for the Bamba models:
+```bash
+pip install git+https://github.com/huggingface/transformers.git
+```
+
 ## Inference
 You can utilize our newly contributed HF integration to run inference on our Bamba models:
 ```python
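For reference, here is a minimal sketch of what the inference snippet opened by the hunk's trailing ```python fence might contain, using the standard transformers generation API. The checkpoint id below is an assumed placeholder for illustration; the diff itself does not name one.

```python
# Minimal sketch of HF inference with a Bamba checkpoint via transformers.
# ASSUMPTION: the model id below is illustrative, not taken from this diff.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-ai-platform/Bamba-9B"  # assumed/illustrative checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt and generate a short continuation.
inputs = tokenizer("Mamba-style hybrid models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```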