# Ovis1.6-Llama3.2-3B / requirements.txt
numpy==1.24.3
torch==2.2.0
transformers==4.44.2
# flash-attn 2.6.3 prebuilt wheel: CUDA 12.3, torch 2.2, Python 3.10 (cp310), Linux x86_64
https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl