Add support for flash-attention2
#3
by shigureui · opened
flash-attention 2 renamed flash_attn_unpadded_func (the new name is flash_attn_varlen_func).
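One common way to handle this rename is a defensive import that tries the 2.x name first and falls back to the 1.x one. This is a sketch, assuming the usual module layout (`flash_attn_varlen_func` exported at the package top level in 2.x, `flash_attn_unpadded_func` under `flash_attn.flash_attn_interface` in 1.x); verify the paths against the installed version:

```python
def resolve_flash_attn():
    """Return the variable-length attention function under whichever
    name this flash-attn version exports, or None if not installed."""
    try:
        # flash-attn >= 2.0: function was renamed to flash_attn_varlen_func
        from flash_attn import flash_attn_varlen_func
        return flash_attn_varlen_func
    except ImportError:
        pass
    try:
        # flash-attn 1.x: old name (assumed module path)
        from flash_attn.flash_attn_interface import flash_attn_unpadded_func
        return flash_attn_unpadded_func
    except ImportError:
        return None  # flash-attn not available; caller should fall back
```

Callers can then use `fn = resolve_flash_attn()` and fall back to a standard attention implementation when it returns None, rather than hard-failing on either flash-attn version.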
Add support for flash-attention 2.0
Install fails on this setup:
NVIDIA-SMI 535.54.03, Driver Version: 535.54.03, CUDA Version: 12.2
shigureui changed pull request status to closed