Nicolas-BZRD's Collections

LLMs Distillation
Paper: Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs (arXiv:2402.12030)
Models:
- mistralai/Mistral-7B-Instruct-v0.2 (Text Generation)
- meta-llama/Llama-2-7b-chat-hf (Text Generation)
- EleutherAI/pythia-160m-deduped (Text Generation)
- EleutherAI/pythia-410m-deduped (Text Generation)
- EleutherAI/pythia-1b-deduped (Text Generation)
- bigscience/bloomz-560m (Text Generation)
- bigscience/mt0-base (Text2Text Generation)
- facebook/opt-350m (Text Generation)
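The list above pairs large instruction-tuned checkpoints with much smaller open models that use different tokenizers, which is the setting the collection's paper targets. As a quick orientation, the sketch below loads one teacher/student pair from the list with the Hugging Face transformers library; the specific pairing chosen here is only an illustrative assumption, not something prescribed by the collection.

```python
# Minimal sketch: load one teacher/student pair from this collection.
# Requires `pip install transformers torch`; the teacher/student choice below
# is an assumption made for illustration, not part of the collection itself.
from transformers import AutoModelForCausalLM, AutoTokenizer

teacher_id = "meta-llama/Llama-2-7b-chat-hf"   # gated on the Hub; license acceptance required
student_id = "EleutherAI/pythia-160m-deduped"

# The two models ship with different tokenizers, i.e. their vocabularies do
# not align, which is exactly the cross-tokenizer distillation scenario.
teacher_tokenizer = AutoTokenizer.from_pretrained(teacher_id)
student_tokenizer = AutoTokenizer.from_pretrained(student_id)

teacher = AutoModelForCausalLM.from_pretrained(teacher_id)
student = AutoModelForCausalLM.from_pretrained(student_id)
```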
Datasets:
- google-research-datasets/qed