Dataset: occiglot/tokenizer-wiki-bench (organization: Occiglot)
Modalities: Text
Formats: parquet
Languages: Afrikaans, Arabic, Bulgarian, +42 more
Size: 10M - 100M rows
ArXiv: arXiv:2012.15613
Libraries: Datasets, Dask, Croissant, +1 more
License: mit
Branch: main · Path: tokenizer-wiki-bench/ta
1 contributor · History: 2 commits
Latest commit: mbrack, "Upload dataset" (aa4534e, verified), 9 months ago
| File | Size | Commit message | Last updated |
|---|---|---|---|
| clean-00000-of-00004.parquet | 181 MB (LFS) | Upload dataset | 9 months ago |
| clean-00001-of-00004.parquet | 90.6 MB (LFS) | Upload dataset | 9 months ago |
| clean-00002-of-00004.parquet | 79.7 MB (LFS) | Upload dataset | 9 months ago |
| clean-00003-of-00004.parquet | 134 MB (LFS) | Upload dataset | 9 months ago |
| train-00000-of-00004.parquet | 190 MB (LFS) | Upload dataset | 9 months ago |
| train-00001-of-00004.parquet | 92.9 MB (LFS) | Upload dataset | 9 months ago |
| train-00002-of-00004.parquet | 81.1 MB (LFS) | Upload dataset | 9 months ago |
| train-00003-of-00004.parquet | 136 MB (LFS) | Upload dataset | 9 months ago |
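Since the repository is a standard Hub dataset stored as Parquet shards and lists the `datasets` library among its supported loaders, the files above can be pulled directly from the Hub. A minimal sketch, assuming the directory name `ta` (presumably Tamil) also serves as the config name and that the shard prefixes `train` and `clean` correspond to the available splits:

```python
from datasets import load_dataset

# Assumption: "ta" is the config name for this subset, and the parquet
# shard prefixes ("train", "clean") map to split names of the same name.
ds = load_dataset("occiglot/tokenizer-wiki-bench", "ta", split="train")

print(ds)       # summary: features and number of rows
print(ds[0])    # first example of the split
```

If the config or split names differ, the shards can also be loaded by pointing `load_dataset("parquet", data_files=...)` at the individual `*.parquet` files shown in the table above.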