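# Prebuilt flash-attn 2.5.9.post1 wheel (CUDA 11.8, CPython 3.10, Linux x86_64, cxx11abi=FALSE);
# if your CUDA/Python/torch versions differ, pick the matching wheel from the same release page.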
https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
torch
transformers
gradio
accelerate
safetensors