L-Tuning / config.json
{
  "_name_or_path": "bigscience/bloom-560m",
  "head_dim": 64,
  "hidden_dropout": 0.0,
  "hidden_size": 1024,
  "n_head": 16,
  "n_layer": 24,
  "num_hidden_layers": 24,
  "pad_token_id": 3,
  "pre_seq_len": 4,
  "problem_type": "single_label_classification",
  "transformers_version": "4.35.2"
}
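
A minimal loading sketch (not part of this repository): parse the config above with the transformers library. The local path "config.json", the use of BloomConfig (inferred from "_name_or_path": "bigscience/bloom-560m", since the file itself carries no "model_type"), and the reading of "pre_seq_len" as the prefix length used by L-Tuning are assumptions, not taken from the repo's own code. Unknown keys such as "pre_seq_len" are simply kept as extra attributes on the config object.

# Sketch, assuming the config above is saved locally as "config.json".
from transformers import BloomConfig

config = BloomConfig.from_json_file("config.json")

print(config.hidden_size)        # 1024
print(config.num_hidden_layers)  # 24 (alias of n_layer)
print(config.pre_seq_len)        # 4 -- presumably the prefix length for L-Tuning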