Text Generation · PyTorch · English · llama

# SeqKD-Llama-7B

[paper](https://arxiv.org/abs/2306.08543) | [code](https://github.com/microsoft/LMOps/tree/main/minillm)

SeqKD-Llama-7B is a Llama-7B model distilled from a Llama-13B teacher on the databricks-dolly-15k dataset using sequence-level knowledge distillation (SeqKD), which minimizes the forward KLD at the sequence level.

It is used as a baseline for MiniLLM.

## Other Baselines

The MiniLLM paper also reports SFT (direct fine-tuning on the dataset) and word-level KD baselines trained under the same setup.

## Citation

```bibtex
@inproceedings{minillm,
  title={MiniLLM: Knowledge Distillation of Large Language Models},
  author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
  booktitle={Proceedings of ICLR},
  year={2024}
}
```