Dataset: databricks/databricks-dolly-15k
KD-Llama-7B is a 7B-parameter student model distilled from a Llama-13B teacher on databricks-dolly-15k using standard token-level forward KL divergence (word-level KD). It serves as a knowledge-distillation baseline for MiniLLM.
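For reference, the token-level forward KLD objective mentioned above minimizes KL(p_teacher || q_student) at each output position, averaged over non-padding tokens. Below is a minimal PyTorch sketch of that loss; the function name, tensor shapes, and masking convention are illustrative assumptions, not taken from the MiniLLM codebase.

```python
import torch
import torch.nn.functional as F

def forward_kld_loss(student_logits, teacher_logits, mask):
    """Token-level forward KL: KL(p_teacher || q_student).

    student_logits, teacher_logits: [batch, seq_len, vocab]
    mask: [batch, seq_len], 1 for real tokens, 0 for padding.
    Returns the KL averaged over non-padding tokens.
    """
    teacher_log_probs = F.log_softmax(teacher_logits, dim=-1)
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    # Per-token KL(p || q) = sum_v p(v) * (log p(v) - log q(v))
    kld = (teacher_log_probs.exp() * (teacher_log_probs - student_log_probs)).sum(-1)
    return (kld * mask).sum() / mask.sum()

# Toy check with random logits (batch 2, length 5, vocab 8).
s = torch.randn(2, 5, 8)
t = torch.randn(2, 5, 8)
m = torch.ones(2, 5)
print(forward_kld_loss(s, t, m))  # scalar loss
```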
@inproceedings{minillm,
  title={MiniLLM: Knowledge Distillation of Large Language Models},
  author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
  booktitle={Proceedings of ICLR},
  year={2024}
}