Use try-except for flash_attn

#5

This PR avoids a hard failure in transformers' import-dependency check.
If a user hasn't installed flash_attn, the import check for flash_attn fails; the same happens for non-CUDA users.

To solve this, we wrap the import in a try-except block: transformers filters out imports inside try-except when collecting dependencies, see https://github.com/huggingface/transformers/blob/main/src/transformers/dynamic_module_utils.py#L155.
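A minimal sketch of the pattern this PR applies (the name `flash_attn_func` is just the commonly imported entry point; the availability flag is illustrative):

```python
# Wrap the optional, CUDA-only flash_attn import in try/except so that
# environments without it can still import this module. transformers'
# dependency check ignores imports inside try/except blocks, so no hard
# failure is raised for users missing flash_attn.
try:
    from flash_attn import flash_attn_func  # optional dependency

    FLASH_ATTN_AVAILABLE = True
except ImportError:
    flash_attn_func = None
    FLASH_ATTN_AVAILABLE = False
```

Code paths that need flash attention can then branch on `FLASH_ATTN_AVAILABLE` and fall back to a standard attention implementation.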
