The Hugging Face fast tokenizer for the LLM-jp ABCI challenge 2023.

The vocab size is 96,867.

Requirements:

  • transformers>=4.34.0
  • tokenizers>=0.14.0
  • torch

Usage:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("llm-jp/hf-fast-tokenizer-v22b2")
```
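Once loaded, the tokenizer behaves like any Hugging Face fast tokenizer. The sketch below is illustrative, not part of the official card: the input string is an arbitrary example, and it assumes the standard `encode`/`decode`/`convert_ids_to_tokens` API of `transformers` fast tokenizers (downloading the files from the Hub on first use).

```python
from transformers import AutoTokenizer

# Fetches tokenizer files from the Hub on first call (network required)
tokenizer = AutoTokenizer.from_pretrained("llm-jp/hf-fast-tokenizer-v22b2")

text = "自然言語処理"  # illustrative Japanese input, "natural language processing"
ids = tokenizer.encode(text)                      # list of token ids
tokens = tokenizer.convert_ids_to_tokens(ids)     # corresponding token strings
decoded = tokenizer.decode(ids, skip_special_tokens=True)

print(ids, tokens, decoded)
print(len(tokenizer))  # total vocabulary size; the card states 96,867
```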