soBeauty/vocab_tokenizer
arxiv: 1910.09700
1 contributor. History: 5 commits. Latest commit a0da413 ("Upload tokenizer") by soBeauty, about 2 years ago.
File                       Size       Last commit         When
.gitattributes             1.48 kB    initial commit      about 2 years ago
README.md                  5.36 kB    Create README.md    about 2 years ago
special_tokens_map.json    125 Bytes  Upload tokenizer    about 2 years ago
tokenizer.json             1.47 MB    Upload tokenizer    about 2 years ago
tokenizer_config.json      360 Bytes  Upload tokenizer    about 2 years ago
vocab.txt                  940 kB     Upload tokenizer    about 2 years ago
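
The files above are the standard artifacts of a Hugging Face tokenizer (tokenizer.json, tokenizer_config.json, special_tokens_map.json, vocab.txt), so the repo should be loadable with the transformers library's AutoTokenizer. A minimal sketch, assuming the repo hosts a standard tokenizer and using an illustrative sample sentence (not taken from the repo's README):

    from transformers import AutoTokenizer

    # Download and load the tokenizer from the Hub repo; this reads
    # tokenizer.json, tokenizer_config.json, special_tokens_map.json,
    # and vocab.txt listed above.
    tokenizer = AutoTokenizer.from_pretrained("soBeauty/vocab_tokenizer")

    # Illustrative usage: encode a sentence, then inspect the token
    # IDs and the token strings they map to.
    encoded = tokenizer("Hello, world!")
    print(encoded.input_ids)
    print(tokenizer.convert_ids_to_tokens(encoded.input_ids))

The presence of both tokenizer.json and vocab.txt suggests a fast (Rust-backed) tokenizer with a word-level or WordPiece-style vocabulary; AutoTokenizer picks the right class from tokenizer_config.json, so no class name needs to be hard-coded.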