Use try-except for flash_attn import

#4

This PR avoids a hard failure in transformers' import dependency check.
If a user has not installed flash_attn, they will fail the import check for flash_attn; the same applies to non-CUDA users.

To solve this, we can wrap the import in a try-except block, since content inside try-except is filtered out by the import check: https://github.com/huggingface/transformers/blob/main/src/transformers/dynamic_module_utils.py#L155.
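A minimal sketch of the proposed change (the exact modeling file and symbol names are illustrative, not taken from this repo): wrapping the import in try-except means transformers' `get_imports()` in `dynamic_module_utils.py` no longer treats flash_attn as a hard requirement of the remote-code module, and users without flash_attn or without CUDA can still load the model.

```python
try:
    # Only available when the flash_attn package is installed (CUDA builds).
    from flash_attn import flash_attn_func, flash_attn_varlen_func
    HAS_FLASH_ATTN = True
except ImportError:
    # CPU-only or non-CUDA users land here; the model falls back to the
    # standard attention path instead of failing at import time.
    flash_attn_func = None
    flash_attn_varlen_func = None
    HAS_FLASH_ATTN = False
```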

LiangliangMa changed pull request status to closed
