Gemma 2B Translation v0.122

  • Eval loss: 0.45365
  • Train loss: 0.43420
  • Learning rate: 6e-05
  • Optimizer: AdamW
  • LR scheduler: cosine
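The hyperparameters above can be mirrored in a simple training config. Only the learning rate, optimizer, and scheduler come from this card; batch size and epoch count are illustrative placeholders, since the card does not report them.

```python
# Sketch of a fine-tuning config matching the hyperparameters listed above.
# Keys follow the naming used by Hugging Face TrainingArguments.
training_config = {
    "learning_rate": 6e-05,         # lr from the card
    "optim": "adamw_torch",         # AdamW optimizer
    "lr_scheduler_type": "cosine",  # cosine schedule
    # Placeholders -- not reported on this card:
    "per_device_train_batch_size": 8,
    "num_train_epochs": 1,
}
```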

Prompt Template

Each training example wraps the source and target text in ##English## / ##Korean## headers, delimited by <bos> and <eos>. Both translation directions are used:

<bos>##English##

Hamsters don't eat cats.

##Korean##

ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.<eos>
<bos>##Korean##

ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.

##English##

Hamsters do not eat cats.<eos>
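A minimal helper for building this template in Python; the tag strings follow the examples above, while the function name and defaults are illustrative, not part of the card.

```python
# Language headers as used in the prompt template above.
LANG_TAGS = {"en": "##English##", "ko": "##Korean##"}

def build_prompt(text: str, src: str = "en", tgt: str = "ko") -> str:
    """Build a generation prompt in the card's template format.

    The model is expected to continue after the target-language header
    and emit <eos> when the translation is complete. (<bos> is normally
    added by the tokenizer; it is written explicitly here only to mirror
    the examples above.)
    """
    return f"<bos>{LANG_TAGS[src]}\n\n{text}\n\n{LANG_TAGS[tgt]}\n\n"

prompt = build_prompt("Hamsters don't eat cats.", "en", "ko")
```

For inference, a string built this way would be tokenized and passed to the model's generate method in the usual transformers workflow, stopping at <eos>.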

Model Description

Downloads last month: 122

  • Format: Safetensors
  • Model size: 2.51B params
  • Tensor type: BF16

Model tree for lemon-mint/gemma-2b-translation-v0.122

  • Base model: beomi/gemma-ko-2b (this model is one of 11 finetunes)
Dataset used to train lemon-mint/gemma-2b-translation-v0.122