feat: add eos_token_id to generation_config.json (needed by vLLM inference) (#12) 989a689 verified czczup wxsm committed on Aug 22
Fix "InternLM2ForCausalLM does not support Flash Attention 2.0 yet" (#3) 743a544 verified czczup kosung committed on Jul 7