---
license: apache-2.0
base_model: openlm-research/open_llama_3b_v2
datasets:
- xaviviro/oasst2_euskera_gpt
tags:
- finetune
- chatml
- gpt4
- basque
model-index:
- name: SUGARRA-3B
  results: []
library_name: transformers
widget:
- text: |
    <|im_start|>user
    Nor zen Isaac Newton?<|im_end|>
    <|im_start|>assistant
language:
- eu
- en
---
# SUGARRA: An experimental 3B ChatML model in Basque

SUGARRA is the result of fine-tuning the open_llama_3b_v2 model on the OpenAssistant v2 instructions, machine-translated into Basque with Helsinki-NLP resources and formatted as ChatML.
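
To illustrate the data preparation, the sketch below wraps a translated instruction/response pair in the ChatML format described in the next section. The helper function and the example text are hypothetical and do not come from the actual training pipeline.

```python
# Hypothetical sketch: wrap one translated OASST2 instruction/response pair
# in ChatML, the format used to fine-tune SUGARRA. Example text is illustrative.

def to_chatml(user_msg: str, assistant_msg: str) -> str:
    """Format a single instruction/response turn with ChatML delimiters."""
    return (
        f"<|im_start|>user\n{user_msg}<|im_end|>\n"
        f"<|im_start|>assistant\n{assistant_msg}<|im_end|>\n"
    )

example = to_chatml(
    "Nor zen Isaac Newton?",                      # prompt translated into Basque
    "<translated assistant answer in Basque>",    # placeholder response
)
print(example)
```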

## Prompt Template

SUGARRA uses the ChatML prompt template:

```
<|im_start|>user
Nor zen Isaac Newton?<|im_end|>
<|im_start|>assistant\n
```
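
A minimal inference sketch with Hugging Face transformers follows, assuming standard `AutoModelForCausalLM` / `AutoTokenizer` loading of `xaviviro/SUGARRA-3B`; the generation settings are illustrative assumptions, not recommendations from the model author.

```python
# Minimal sketch: generate a reply with SUGARRA-3B using the ChatML template.
# Generation parameters below are illustrative, not official defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xaviviro/SUGARRA-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = (
    "<|im_start|>user\n"
    "Nor zen Isaac Newton?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
# Decode only the newly generated tokens (the assistant's reply).
reply = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply)
```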

## References

```bibtex
@software{xaviviro2024sugarra,
  author = {xaviviro},
  title = {SUGARRA: An experimental 3B ChatML model in Basque},
  month = jan,
  year = 2024,
  url = {https://huggingface.co/xaviviro/SUGARRA-3B}
}

@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = may,
  year = 2023,
  url = {https://github.com/openlm-research/open_llama}
}

@software{together2023redpajama,
  author = {Together Computer},
  title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = apr,
  year = 2023,
  url = {https://github.com/togethercomputer/RedPajama-Data}
}

@article{touvron2023llama,
  title = {LLaMA: Open and Efficient Foundation Language Models},
  author = {Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
  journal = {arXiv preprint arXiv:2302.13971},
  year = {2023}
}
```