---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
library_name: transformers
tags:
  - llm
  - code
---

# CrystalChat

We present CrystalChat, an instruction-following model fine-tuned from [LLM360/CrystalCoder](https://huggingface.co/LLM360/CrystalCoder).

| Model | Trained Tokens | ARC | HellaSwag | MMLU (5-shot) | GSM8K | Winogrande (5-shot) | TruthfulQA | Language Avg. | HumanEval (pass@1) | MBPP (pass@1) | Coding Avg. | Avg. of Avg. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Mistral-7B-Instruct-v0.1 | - | 54.86 | 75.71 | 55.56 | 32.00 | 74.27 | 55.90 | 58.05 | 29.27 | 31.96 | 30.62 | 44.34 |
| CrystalChat 7B | 1.4T | 51.71 | 76.12 | 53.22 | 28.05 | 70.64 | 47.29 | 53.29 | 34.12 | 39.11 | 36.62 | 50.07 |
| CodeLlama-7b-Instruct | 2.5T | 43.35 | 66.14 | 42.75 | 15.92 | 64.33 | 39.23 | 45.29 | 34.12 | 38.91 | 36.52 | 40.91 |
| Llama-2-7b-Chat | 2T | 53.07 | 78.39 | 48.42 | 18.88 | 73.09 | 45.30 | 52.86 | 13.26 | 17.43 | 15.35 | 34.11 |
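The averaged columns appear to be unweighted means: Language Avg. over the six language benchmarks, Coding Avg. over HumanEval and MBPP, and Avg. of Avg. over those two means. A minimal sketch checking this against the Mistral-7B-Instruct-v0.1 row (values copied from the table above):

```python
# Mistral-7B-Instruct-v0.1 scores, copied from the table above
language_scores = [54.86, 75.71, 55.56, 32.00, 74.27, 55.90]  # ARC .. TruthfulQA
coding_scores = [29.27, 31.96]                                # HumanEval, MBPP

language_avg = sum(language_scores) / len(language_scores)  # matches 58.05 in the table
coding_avg = sum(coding_scores) / len(coding_scores)        # matches 30.62 up to rounding
avg_of_avg = (language_avg + coding_avg) / 2                # matches 44.34 up to rounding

print(f"{language_avg:.2f} {coding_avg:.2f} {avg_of_avg:.2f}")
```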

## Model Description

### Loading CrystalChat

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Use the first GPU if available, otherwise fall back to CPU
device = "cuda:0" if torch.cuda.is_available() else "cpu"

# trust_remote_code is required because CrystalChat ships custom model code
tokenizer = AutoTokenizer.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("LLM360/CrystalChat", trust_remote_code=True).to(device)

prompt = 'int add(int x, int y) {'

input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
gen_tokens = model.generate(input_ids, do_sample=True, max_length=400)

print("-" * 20 + "Output for model" + "-" * 20)
print(tokenizer.batch_decode(gen_tokens)[0])
```

## Citation

BibTeX:

```bibtex
@misc{liu2023llm360,
      title={LLM360: Towards Fully Transparent Open-Source LLMs},
      author={Zhengzhong Liu and Aurick Qiao and Willie Neiswanger and Hongyi Wang and Bowen Tan and Tianhua Tao and Junbo Li and Yuqi Wang and Suqi Sun and Omkar Pangarkar and Richard Fan and Yi Gu and Victor Miller and Yonghao Zhuang and Guowei He and Haonan Li and Fajri Koto and Liping Tang and Nikhil Ranjan and Zhiqiang Shen and Xuguang Ren and Roberto Iriondo and Cun Mu and Zhiting Hu and Mark Schulze and Preslav Nakov and Tim Baldwin and Eric P. Xing},
      year={2023},
      eprint={2312.06550},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```