smol_llama-220M-openhermes / generation_config.json
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.36.2",
  "use_cache": false
}
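
A minimal sketch of loading this generation config with the transformers library; the Hub repo id "pszemraj/smol_llama-220M-openhermes" is inferred from the file path above and may differ from the actual repository name.

from transformers import GenerationConfig

# Load the generation_config.json shown above from the Hub
# (repo id is an assumption based on the path/author shown here).
gen_config = GenerationConfig.from_pretrained("pszemraj/smol_llama-220M-openhermes")

print(gen_config.bos_token_id)  # 1
print(gen_config.eos_token_id)  # 2
print(gen_config.use_cache)     # False

The same values can also be passed to model.generate() via a GenerationConfig object, which is how transformers applies these defaults at inference time.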