The Hugging Face fast tokenizer for the LLM-jp ABCI challenge 2023.
The vocabulary size is 96,867.
Requirements:
- transformers>=4.34.0
- tokenizers>=0.14.0
- torch
Usage:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("llm-jp/hf-fast-tokenizer-v22b2")
```
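Once loaded, the tokenizer can be exercised with a simple encode/decode round trip. This is a minimal sketch: it assumes the tokenizer files can be downloaded from the Hub on first use, and the sample sentence is illustrative, not taken from the model card.

```python
from transformers import AutoTokenizer

# Download the fast tokenizer from the Hugging Face Hub
# (requires network access on first use; cached afterwards).
tokenizer = AutoTokenizer.from_pretrained("llm-jp/hf-fast-tokenizer-v22b2")

# Encode a sample sentence into token IDs, then decode it back.
# The sentence below is an illustrative example.
text = "自然言語処理"
ids = tokenizer.encode(text, add_special_tokens=False)
decoded = tokenizer.decode(ids)

print(len(tokenizer))  # total number of vocabulary entries
print(ids)             # token ID sequence for the input text
print(decoded)         # text reconstructed from the IDs
```

Because this is a fast (Rust-backed) tokenizer, `tokenizer.is_fast` is `True`, and batch encoding of many strings is handled efficiently by the underlying `tokenizers` library.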