gpt2-large-coedit / generation_config.json
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "max_new_tokens": 350,
  "pad_token_id": 50256,
  "padding_side": "left",
  "transformers_version": "4.39.3"
}
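
Transformers reads this file automatically when the model is loaded, so the defaults above (eos/bos/pad token 50256, max_new_tokens 350) apply to generate() without extra arguments. A minimal usage sketch, assuming the Hub repo id "iliazlobin/gpt2-large-coedit" and an illustrative editing prompt (the prompt format is an assumption, not taken from this file):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "iliazlobin/gpt2-large-coedit"  # assumed repo id for this config

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# Mirror the config above: GPT-2 has no native pad token, so eos (50256)
# doubles as pad, and left padding matches "padding_side": "left".
tokenizer.padding_side = "left"
tokenizer.pad_token_id = 50256

model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "Fix the grammar: She go to school yesterday."  # hypothetical prompt
inputs = tokenizer(prompt, return_tensors="pt")

# model.generation_config is populated from generation_config.json,
# so max_new_tokens=350 and the token ids above are used by default.
output_ids = model.generate(**inputs)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```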