run8-llama2_72B / generation_config.json
{
"_attn_implementation": "flash_attention_2",
"bos_token_id": 1,
"do_sample": true,
"eos_token_id": 2,
"max_length": 4096,
"pad_token_id": 0,
"temperature": 0.6,
"top_p": 0.9,
"transformers_version": "4.42.2"
}
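
The file sets nucleus-sampling defaults (`do_sample` with `temperature` 0.6 and `top_p` 0.9) that `transformers` picks up automatically at generation time. A minimal sketch of loading and inspecting it, assuming the checkpoint is hosted at the Hub repo id `shivamag99/run8-llama2_72B` (inferred from this page, not confirmed):

```python
from transformers import GenerationConfig

# from_pretrained fetches generation_config.json from the repo root.
# The repo id below is an assumption based on the page header.
gen_config = GenerationConfig.from_pretrained("shivamag99/run8-llama2_72B")

# The fields above are exposed as attributes.
print(gen_config.do_sample)    # True
print(gen_config.temperature)  # 0.6
print(gen_config.top_p)        # 0.9
print(gen_config.max_length)   # 4096

# Pass the config to generate() to apply these defaults explicitly:
# output_ids = model.generate(input_ids, generation_config=gen_config)
```

Note that `_attn_implementation: "flash_attention_2"` requires the `flash-attn` package and compatible hardware; it is usually supplied as the `attn_implementation` argument when the model itself is loaded, rather than being read from the generation config.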