Transformers
PyTorch
English
tau/sled
Inference Endpoints
{
  "tokenizer_class": "SledTokenizer",
  "base_tokenizer": "facebook/bart-base",
  "model_max_length": 16384
}
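
The config declares a custom SledTokenizer that wraps the facebook/bart-base tokenizer and raises the maximum input length to 16384 tokens. Below is a minimal loading sketch, assuming the SLED classes ship in the authors' py-sled package and that importing sled registers SledTokenizer with the Transformers Auto classes (neither is shown on this page); the repo id tau/sled is taken from the page header.

# Minimal loading sketch for the config above (see assumptions in the comments).
# SledTokenizer is not part of core Transformers; importing `sled` (from the
# py-sled package) is assumed here to register the SLED classes with the
# Auto classes so AutoTokenizer can resolve "SledTokenizer".
import sled  # noqa: F401 -- imported only for its registration side effect
from transformers import AutoTokenizer

# Repo id taken from this page; substitute a specific SLED checkpoint if needed.
tokenizer = AutoTokenizer.from_pretrained("tau/sled")

# Per the config, the underlying tokenizer is facebook/bart-base, but
# model_max_length is raised from BART's usual 1024 to 16384 tokens,
# so long inputs are not truncated at the base model's limit.
print(tokenizer.model_max_length)  # expected: 16384
ids = tokenizer("A very long input document ...").input_ids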