Fixing the config.json of the model
02b43c3 verified
{
"sentence_dim": 768,
"token_dim": 768,
"num_heads": 8,
"initialize": "random"
"pooling_type": 0
}
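The fix above adds the comma that was missing after `"initialize": "random"`, which made the file invalid JSON. A minimal sketch of how to verify the corrected file parses (the `config_text` literal below simply reproduces the fixed contents inline; in practice you would load `config.json` from disk):

```python
import json

# Corrected config.json contents; note the comma after "initialize",
# which was missing in the original commit and broke JSON parsing.
config_text = """{
    "sentence_dim": 768,
    "token_dim": 768,
    "num_heads": 8,
    "initialize": "random",
    "pooling_type": 0
}"""

# json.loads raises json.JSONDecodeError if the text is malformed,
# so a successful parse confirms the fix.
config = json.loads(config_text)
print(config["num_heads"])
```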