PΛBLØ ᄃΞ committed
Commit f18e1f0 · unverified · 1 parent: 629c76c

Update requirements.txt


- exllamav2 v0.2.7
- flash_attn v2.7.4

Files changed (1)
  requirements.txt  +2 −2
requirements.txt CHANGED
@@ -1,6 +1,6 @@
  huggingface_hub==0.28.1
  tokenizers
  numpy==2.2.2
- https://github.com/turboderp/exllamav2/releases/download/v0.2.6/exllamav2-0.2.6+cu121.torch2.3.1-cp310-cp310-linux_x86_64.whl
- https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ https://github.com/turboderp/exllamav2/releases/download/v0.2.7/exllamav2-0.2.7+cu121.torch2.3.1-cp310-cp310-linux_x86_64.whl
+ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4/flash_attn-2.7.4+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  jinja2
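Because these two dependencies are pinned as direct wheel URLs rather than `name==version` specifiers, the intended version is only visible inside the wheel filename (PEP 427 format: `{name}-{version}[+{local}]-{pytag}-{abi}-{platform}.whl`). A small sketch of how one could sanity-check that a pinned URL carries the expected version; `wheel_version` is a hypothetical helper, not part of this repo:

```python
import re

def wheel_version(url: str) -> str:
    """Return the version segment of a wheel filename embedded in a URL.

    Hypothetical helper: parses the PEP 427 filename shape
    {name}-{version}[+{local}]-{pytag}-{abi}-{platform}.whl,
    stopping the version at the first '+' (local tag) or '-'.
    """
    fname = url.rsplit("/", 1)[-1]
    m = re.match(r"[A-Za-z0-9_]+-([0-9][^+-]*)", fname)
    if not m:
        raise ValueError(f"not a recognizable wheel filename: {fname}")
    return m.group(1)

# The two URLs pinned by this commit:
exllamav2_url = "https://github.com/turboderp/exllamav2/releases/download/v0.2.7/exllamav2-0.2.7+cu121.torch2.3.1-cp310-cp310-linux_x86_64.whl"
flash_attn_url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4/flash_attn-2.7.4+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"

print(wheel_version(exllamav2_url))   # → 0.2.7
print(wheel_version(flash_attn_url))  # → 2.7.4
```

Note that the wheel tags also pin the runtime environment: CUDA 12.x, torch 2.3/2.4, CPython 3.10, and `linux_x86_64`, so the install target must match those tags.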