team-lucid/t5-v1_1-base-ko

Google's T5 Version 1.1, trained on a Korean corpus

t5-v1_1-base-ko is a T5 v1.1 model trained on a Korean corpus.

To prevent out-of-vocabulary (OOV) tokens, byte-level BPE (BBPE) was used. Following the HyperCLOVA report that morphological analysis helps performance, MeCab was applied during tokenizer training so that morphemes would not be tokenized in unnatural ways.
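A minimal sketch (not the actual tokenizer) of why byte-level BPE avoids OOV: every string, Korean included, decomposes into UTF-8 bytes in the range 0–255, so any input maps onto a fixed base vocabulary before merges are learned.

```python
# Hypothetical illustration of the BBPE base vocabulary, not the model's
# tokenizer: any text reduces to bytes, so no character is ever OOV.
text = "한국어"
byte_ids = list(text.encode("utf-8"))

# Each Hangul syllable occupies 3 bytes in UTF-8, so "한국어" -> 9 base tokens.
print(len(byte_ids))  # 9
print(all(0 <= b < 256 for b in byte_ids))  # True
```

BPE merges are then learned over these byte sequences, which is where the MeCab pre-segmentation keeps merges from crossing morpheme boundaries.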

이 μ—°κ΅¬λŠ” κ΅¬κΈ€μ˜ TPU Research Cloud(TRC)λ₯Ό 톡해 지원받은 Cloud TPU둜 ν•™μŠ΅λ˜μ—ˆμŠ΅λ‹ˆλ‹€.

Usage

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained('team-lucid/t5-v1_1-base-ko')
model = T5ForConditionalGeneration.from_pretrained('team-lucid/t5-v1_1-base-ko')
```
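Since this checkpoint is only span-corruption pretrained (not fine-tuned for a downstream task), a hedged sketch of exercising it is to fill a masked span with a sentinel token; the input sentence here is an arbitrary example, not from the card.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained('team-lucid/t5-v1_1-base-ko')
model = T5ForConditionalGeneration.from_pretrained('team-lucid/t5-v1_1-base-ko')

# Span-corruption style input: <extra_id_0> marks the span to predict.
inputs = tokenizer("한국어 문장의 <extra_id_0> 예시입니다.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For real downstream use (summarization, translation, etc.), the model should be fine-tuned first, as with any T5 v1.1 checkpoint.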
Model size: 248M parameters (F32, Safetensors)