ZhangRC committed · verified · Commit e24f29a · Parent(s): bd5d404

Update README.md

Files changed (1):
  1. README.md (+2 -1)
README.md CHANGED
@@ -45,10 +45,11 @@ This is RWKV-7 model under flash-linear attention format.
  ## Uses

  <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
- Install flash-linear-attention before using this model:
+ Install `flash-linear-attention` and the latest version of `transformers` before using this model:

  ```bash
  pip install git+https://github.com/fla-org/flash-linear-attention
+ pip install 'transformers>=4.48.0'
  ```

  ### Direct Use
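
For reference, a minimal loading sketch once both packages are installed. This is an assumption-laden illustration, not the model card's official snippet: the repo id below is a placeholder for this model's actual Hugging Face id, and it assumes the fla-format RWKV-7 checkpoint loads through transformers' Auto classes with `trust_remote_code=True`.

```python
# Minimal sketch, assuming the install steps above succeeded.
# "your-org/rwkv7-model" is a placeholder (assumption); replace it with
# this model's actual Hugging Face repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/rwkv7-model"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation to confirm the setup works end to end.
inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```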