LLM-Drop Collection: Model weights for the paper "What Matters in Transformers? Not All Attention is Needed" (https://arxiv.org/abs/2406.15786) • 14 items • Updated Oct 23
What Matters in Transformers? Not All Attention is Needed • Paper • arXiv:2406.15786 • Published Jun 22
🪐 SmolLM Collection: A series of smol LLMs: 135M, 360M, and 1.7B. We release base and Instruct models, as well as the training corpus and some WebGPU demos • 12 items • Updated 4 days ago