OpenLLaMA-3B-Chat: Chat Model on top of Open Reproduction of LLaMA

Training Traces

The training traces for this model are logged to Weights & Biases (wandb).

Prompt Format

<human>: Who is Alan Turing?<|endoftext|><assistant>:
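For example, here is a minimal generation sketch using Hugging Face transformers and the prompt format above. This assumes the model loads with AutoModelForCausalLM; the repo id below is a placeholder, so substitute the actual Hub id for this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: replace with the actual Hugging Face Hub id.
model_id = "openllama-3b-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The published weights are FP16, so load in half precision.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Build the prompt in the format shown above.
prompt = "<human>: Who is Alan Turing?<|endoftext|><assistant>:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate the assistant's reply.
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```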

Reference

If you find OpenLLaMA useful in your research or applications, please cite it using the following BibTeX entries:

@software{Yao_FMEngine_Library_for_2023,
  author = {Yao, Xiaozhe},
  doi = {10.5281/zenodo.8314779},
  month = sep,
  title = {{FMEngine: Library for Training Foundation Models}},
  url = {https://github.com/eth-easl/fmengine},
  version = {0.0.1},
  year = {2023}
}
@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title = {{OpenLLaMA}: An Open Reproduction of {LLaMA}},
  month = may,
  year = {2023},
  url = {https://github.com/openlm-research/open_llama}
}
@software{together2023redpajama,
  author = {{Together Computer}},
  title = {{RedPajama-Data}: An Open Source Recipe to Reproduce {LLaMA} Training Dataset},
  month = apr,
  year = {2023},
  url = {https://github.com/togethercomputer/RedPajama-Data}
}
@article{touvron2023llama,
  title = {{LLaMA}: Open and Efficient Foundation Language Models},
  author = {Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
  journal = {arXiv preprint arXiv:2302.13971},
  year = {2023}
}

Limitations and Bias

As with all language models, OpenLLaMA-3B-Chat may generate incorrect or biased content. Keep this in mind when using the model.

Model Details

3.43B parameters, FP16 weights, distributed in safetensors format.