
# Vocabulary Trimmed google/mt5-base: seonjeongh/mt5-base-ko

This model is a trimmed version of google/mt5-base, produced with vocabtrimmer, a tool that trims the vocabulary of a language model to compress its size. The following table summarizes the trimming process.

| | google/mt5-base | seonjeongh/mt5-base-ko |
|---|---:|---:|
| parameter_size_full | 582,401,280 | 310,904,064 |
| parameter_size_embedding | 384,172,032 | 112,674,816 |
| vocab_size | 250,112 | 73,356 |
| compression_rate_full | 100.0 | 53.38 |
| compression_rate_embedding | 100.0 | 29.33 |
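The compression rates above are simply the trimmed parameter counts expressed as a percentage of the originals. A minimal sketch (the function name and constants are illustrative, not part of vocabtrimmer):

```python
# Parameter counts taken from the summary table above.
FULL_ORIG = 582_401_280       # google/mt5-base, all parameters
FULL_TRIMMED = 310_904_064    # seonjeongh/mt5-base-ko, all parameters
EMB_ORIG = 384_172_032        # google/mt5-base, embedding parameters
EMB_TRIMMED = 112_674_816     # seonjeongh/mt5-base-ko, embedding parameters

def compression_rate(trimmed: int, original: int) -> float:
    """Trimmed size as a percentage of the original size."""
    return round(100 * trimmed / original, 2)

print(compression_rate(FULL_TRIMMED, FULL_ORIG))  # 53.38
print(compression_rate(EMB_TRIMMED, EMB_ORIG))    # 29.33
```

The embedding matrices shrink far more (to 29.33%) than the model as a whole (53.38%), because trimming removes only embedding rows while the transformer layers are untouched.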

The following table shows the parameters used to trim the vocabulary.

| language | dataset | dataset_column | dataset_name | dataset_split | target_vocab_size | min_frequency |
|---|---|---|---|---|---|---|
| ko | vocabtrimmer/mc4_validation | text | ko | validation | | 2 |
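Since no target_vocab_size was set, frequency alone decides which tokens survive: a token is kept if it occurs at least min_frequency (here 2) times in the Korean mC4 validation split. A conceptual sketch of that criterion (not vocabtrimmer's actual implementation; `select_vocab` and the toy corpus are illustrative):

```python
from collections import Counter

def select_vocab(token_ids, min_frequency=2):
    """Keep token ids seen at least `min_frequency` times in the corpus."""
    counts = Counter(token_ids)
    return sorted(t for t, c in counts.items() if c >= min_frequency)

corpus_ids = [5, 5, 7, 9, 9, 9, 11]  # toy tokenized trimming corpus
print(select_vocab(corpus_ids))      # [5, 9]
```

Applied to the full mC4 validation text, this kind of filter reduces the shared multilingual vocabulary of 250,112 tokens to the 73,356 Korean-relevant tokens reported above.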