gpt2-spanish-medium / tokenizer_config.json
Update from Deep ESP
ce30d84
115 Bytes
{"pad_token": "<|endoftext|>", "special_tokens_map_file": "./special_tokens_map.json", "full_tokenizer_file": null}