gpt2_lithuanian_small / tokenizer_config.json
Commit cabe18e by Deividas Mataciunas: "First model version"
{"special_tokens_map_file": null, "full_tokenizer_file": null}