Gemma 2B Translation v0.110

  • Eval loss: 0.59812
  • Train loss: 0.40320
  • Learning rate: 6e-05
  • Optimizer: AdamW
  • LR scheduler: cosine
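
The hyperparameters above map onto a standard Hugging Face TrainingArguments configuration. The sketch below is an illustration under that assumption: only the learning rate, optimizer, and scheduler type come from this card, while the batch size, epoch count, and exact AdamW variant are placeholders.

from transformers import TrainingArguments

# Sketch of a training configuration matching the hyperparameters listed above.
# Only learning_rate, optim, and lr_scheduler_type come from the card; the rest
# are illustrative placeholders.
training_args = TrainingArguments(
    output_dir="gemma-2b-translation-v0.110",
    learning_rate=6e-5,              # Learning rate: 6e-05
    optim="adamw_torch",             # Optimizer: AdamW (exact variant is an assumption)
    lr_scheduler_type="cosine",      # LR scheduler: cosine
    bf16=True,                       # assumption, matching the BF16 weights below
    per_device_train_batch_size=4,   # placeholder
    num_train_epochs=1,              # placeholder
)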

Prompt Template

<bos>### English

Hamsters don't eat cats.

### Korean

햄스터는 고양이를 먹지 않습니다.<eos>
<bos>### Korean

햄스터는 고양이를 먹지 않습니다.

### English

Hamsters don't eat cats.<eos>
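
A minimal way to use the template above for English-to-Korean inference is sketched below. It assumes the standard transformers generation API and that the tokenizer prepends <bos> on its own; the decoding settings are illustrative, not taken from this card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.110"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build an English -> Korean prompt in the template format; the tokenizer is
# assumed to prepend <bos> itself, so only the section markers are written out.
prompt = "### English\n\nHamsters don't eat cats.\n\n### Korean\n\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))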

Model Description

  • Model size: 2.51B params
  • Tensor type: BF16
  • Format: Safetensors

Model tree for lemon-mint/gemma-2b-translation-v0.110

Base model: beomi/gemma-ko-2b
This model is a fine-tune of the base model above.

Dataset used to train lemon-mint/gemma-2b-translation-v0.110