learning_rate: 7e-6
epoch: 3
context_length: 2048
RAG setting:
  search method: bm25
  not answering proportion: 0.0
  max document length: 5
  extra bm25 data: false
prompt:
[
  {"role": "system", "content": "從<Document>裡找到答案並利用該答案回答<Query>所敘述的問題"},
  {"role": "system", "content": "<Document>{Q}{A}</Document>"},
  {"role": "user", "content": "<Query>{Q}</Query>"},
  {"role": "assistant", "content": "{A}"}
]
(The first system message in English: "Find the answer in the <Document> and use it to answer the question described in the <Query>.")
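As a concrete illustration of the RAG settings above, here is a minimal retrieval sketch using BM25. It assumes the `rank_bm25` package and `jieba` for Chinese tokenization, and treats "max document length: 5" as the maximum number of retrieved passages; the corpus and query are made-up examples, not part of the training data.

```python
# Hedged sketch of the retrieval step (search method: bm25).
# Assumptions: rank_bm25 + jieba are used, and "max document length: 5"
# means at most 5 retrieved passages; the corpus and query are illustrative.
import jieba
from rank_bm25 import BM25Okapi

corpus = [
    "2023 年第四季營收為新台幣 120 億元。",
    "董事會決議配發每股 2.5 元現金股利。",
    "本公司主要業務為晶圓代工服務。",
]
tokenized_corpus = [jieba.lcut(doc) for doc in corpus]
bm25 = BM25Okapi(tokenized_corpus)

query = "第四季營收是多少？"
top_docs = bm25.get_top_n(jieba.lcut(query), corpus, n=5)  # keep at most 5 passages
print(top_docs)
```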
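And a minimal sketch of filling the prompt template at inference time and generating an answer with `transformers`; the document/query strings and generation parameters are illustrative assumptions, not taken from the training setup.

```python
# Hedged sketch: build the chat messages from the prompt template above and
# query the fine-tuned model. {Q}/{A} placeholders become the retrieved passage
# and the user question; generation settings here are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cool9203/Llama3.1-8B-Chinese-Chat-iii_finance-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

document = "2023 年第四季營收是多少？2023 年第四季營收為新台幣 120 億元。"  # retrieved {Q}{A} passage
query = "第四季營收是多少？"

messages = [
    {"role": "system", "content": "從<Document>裡找到答案並利用該答案回答<Query>所敘述的問題"},
    {"role": "system", "content": f"<Document>{document}</Document>"},
    {"role": "user", "content": f"<Query>{query}</Query>"},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```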
Model tree for cool9203/Llama3.1-8B-Chinese-Chat-iii_finance-v2
- Base model: meta-llama/Llama-3.1-8B
- Finetuned: meta-llama/Llama-3.1-8B-Instruct
- Quantized: shenzhi-wang/Llama3.1-8B-Chinese-Chat