---
license: llama3
datasets:
- lmg-anon/VNTL-v3.1-1k
- lmg-anon/VNTL-Chat
language:
- ja
- en
library_name: peft
base_model: rinna/llama-3-youko-8b
pipeline_tag: translation
---
# Summary
This is an [LLaMA 3 Youko](https://huggingface.co/rinna/llama-3-youko-8b) QLoRA created using a slightly modified version of the [VNTL-v3.1-1k](https://huggingface.co/datasets/lmg-anon/VNTL-v3.1-1k) dataset, concatenated with the [VNTL-Chat](https://huggingface.co/datasets/lmg-anon/VNTL-Chat) dataset.
It was trained with mostly the same hyperparameters as the [VNTL 7B v0.3.1 lora](https://huggingface.co/lmg-anon/vntl-7b-v0.3.1-lora); the differences are:
- Added \["<\<METADATA>>", "<\<TRANSLATE>>", "<\<JAPANESE>>", "<\<ENGLISH>>", "<\<CHAT>>", "<\<HUMAN>>", "<\<LLM>>"\] as special tokens.
- Trained the \["embed_tokens", "lm_head"\] layers.
- 10x smaller learning rate, 0.00065 -> 0.000065.
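As an illustration of the changes listed above (the author's actual training script isn't published here, and the rank/alpha values below are placeholders, not from this card), the special-token and layer setup might look like this with `transformers` and `peft`:

```python
from transformers import AutoTokenizer
from peft import LoraConfig

# The seven special tokens added for this fine-tune (from the list above).
SPECIAL_TOKENS = ["<<METADATA>>", "<<TRANSLATE>>", "<<JAPANESE>>",
                  "<<ENGLISH>>", "<<CHAT>>", "<<HUMAN>>", "<<LLM>>"]

tokenizer = AutoTokenizer.from_pretrained("rinna/llama-3-youko-8b")
tokenizer.add_special_tokens({"additional_special_tokens": SPECIAL_TOKENS})

lora_config = LoraConfig(
    r=8,            # placeholder: the card doesn't state rank/alpha
    lora_alpha=16,  # placeholder
    modules_to_save=["embed_tokens", "lm_head"],  # fully train these layers
    task_type="CAUSAL_LM",
)
# Before wrapping the base model with this config, call
# model.resize_token_embeddings(len(tokenizer)) so the new tokens get rows.
```

Saving `embed_tokens` and `lm_head` alongside the adapter is what lets the newly added tokens learn useful embeddings instead of staying random.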
This version also includes a new "chat mode", which was trained lazily just to find out how it would impact the end result. It ended up quite good at breaking down and explaining Japanese sentences, but it is terrible at most other things, which is expected given the nature of the VNTL-Chat dataset. To be honest, I wasn't aiming for it to be good, so the fact that it works at all is very nice.
**Eval Loss**: 0.8
## Translation Prompt
This is a prompt example for translation:
```
<<METADATA>>
[character] Name: Uryuu Shingo (瓜生 新吾) | Gender: Male | Aliases: Onii-chan (お兄ちゃん)
[character] Name: Uryuu Sakuno (瓜生 桜乃) | Gender: Female
<<TRANSLATE>>
<<JAPANESE>>
[桜乃]: 「……ごめん」
<<ENGLISH>>
[Sakuno]: 「... Sorry.」<|end_of_text|>
<<JAPANESE>>
[新吾]: 「ううん、こう言っちゃなんだけど、迷子でよかったよ。桜乃は可愛いから、いろいろ心配しちゃってたんだ、俺」
<<ENGLISH>>
```
The generated translation for that prompt, with temperature 0, is:
```
[Shingo]: 「No, I know this isn't the best way to say it, but I'm glad you got lost. You're so cute that I was worried about you.」
```
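For programmatic use, the translation format shown above can be assembled with a small helper. This is my own sketch of the layout from the example, not an official API of this model:

```python
def build_vntl_prompt(metadata, pairs, next_japanese):
    """Assemble a VNTL translation prompt in the layout shown above.

    metadata:      list of "[character] ..." description lines
    pairs:         (japanese, english) line pairs already translated, in order
    next_japanese: the Japanese line the model should translate next
    """
    parts = ["<<METADATA>>", *metadata, "<<TRANSLATE>>"]
    for ja, en in pairs:
        # Each finished pair ends with the <|end_of_text|> marker, as in the example.
        parts += ["<<JAPANESE>>", ja, "<<ENGLISH>>", en + "<|end_of_text|>"]
    # The prompt stops right after <<ENGLISH>> so the model fills in the translation.
    parts += ["<<JAPANESE>>", next_japanese, "<<ENGLISH>>", ""]
    return "\n".join(parts)
```

Feeding the resulting string to the model with greedy decoding (temperature 0) should reproduce outputs like the one above.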
## Chat Prompt
This is a prompt example for chat:
```
<<CHAT>>
<<HUMAN>>
Hi! Can you break this down to me? Like, what is the meaning of "桜乃は可愛いから"?
"「ううん、こう言っちゃなんだけど、迷子でよかったよ。桜乃は可愛いから、いろいろ心配しちゃってたんだ、俺」"
<<LLM>>
```
The generated response for that prompt, with temperature 0, is:
```
Certainly! The phrase "桜乃は可愛いから" literally translates to "Because Sakura-no is cute." In Japanese culture, there is a strong emphasis on physical attractiveness and cuteness, especially when it comes to young girls or women. This phrase suggests that the speaker finds Sakura-no (presumably a girl) to be very attractive and endearing in appearance.
```
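Chat mode follows the same pattern; here is a minimal helper in the same spirit (again my own sketch, not part of the card):

```python
def build_vntl_chat_prompt(user_message):
    """Assemble a VNTL chat-mode prompt in the layout shown above."""
    # The prompt ends right after <<LLM>> so the model writes the reply.
    return "\n".join(["<<CHAT>>", "<<HUMAN>>", user_message, "<<LLM>>", ""])
```

As with translation, temperature 0 (greedy decoding) reproduces the deterministic response shown above.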