Prot2Text-Large-v1-0 / added_tokens.json
{
"<|graph_token|>": 50257,
"<|stop_token|>": 50258
}
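
A minimal sketch (not the official loading code) of how these two entries extend the tokenizer, assuming the transformers library and a GPT-2-style base vocabulary: GPT-2's vocabulary spans IDs 0-50256, so the first added token lands at 50257 and the second at 50258, matching the file above.

# Sketch only: reproduces the IDs in added_tokens.json by extending a GPT-2 tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Register the same special tokens listed in added_tokens.json.
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<|graph_token|>", "<|stop_token|>"]}
)

print(tokenizer.convert_tokens_to_ids("<|graph_token|>"))  # 50257
print(tokenizer.convert_tokens_to_ids("<|stop_token|>"))   # 50258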