L-Tuning / config.json
{
  "_name_or_path": "bigscience/bloom-560m",
  "hidden_dropout": 0.0,
  "hidden_size": 1024,
  "n_head": 16,
  "n_layer": 24,
  "num_hidden_layers": 24,
  "pad_token_id": 3,
  "pre_seq_len": 3,
  "problem_type": "single_label_classification",
  "transformers_version": "4.35.2"
}
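The fields above can be inspected with plain Python before handing them to a model loader. This is a minimal sketch that inlines the config shown here so it is self-contained; the interpretation of `pre_seq_len` as a prefix length follows the L-Tuning repository's naming and is an assumption, not something the config file itself states.

```python
import json

# Inlined copy of the config.json shown above.
config_text = """
{
  "_name_or_path": "bigscience/bloom-560m",
  "hidden_dropout": 0.0,
  "hidden_size": 1024,
  "n_head": 16,
  "n_layer": 24,
  "num_hidden_layers": 24,
  "pad_token_id": 3,
  "pre_seq_len": 3,
  "problem_type": "single_label_classification",
  "transformers_version": "4.35.2"
}
"""

config = json.loads(config_text)

# n_layer is BLOOM's legacy field; num_hidden_layers is the name newer
# transformers versions expect. Both are present and should agree.
assert config["n_layer"] == config["num_hidden_layers"]

# Per-head dimension of the attention blocks: hidden_size split across n_head.
head_dim = config["hidden_size"] // config["n_head"]

print(config["_name_or_path"])  # base checkpoint this config derives from
print(head_dim)                 # 1024 / 16 = 64
```

Checks like these catch a mismatched or hand-edited config early, before a framework raises a less readable shape error at load time.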