SmolLM2 Humanized
Humanized versions of SmolLM2 models.
This repository contains the humanized SmolLM2-135M model in the GGUF format.
More about this model
We advise you to clone llama.cpp and install it following the official guide; we track the latest version of llama.cpp. In the following demonstration, we assume you are running commands from the root of the llama.cpp repository.
Since cloning the entire model repository may be inefficient, you can manually download just the GGUF file you need, or use huggingface-cli:
```shell
pip install -U huggingface_hub
huggingface-cli download AssistantsLab/SmolLM2-135M-humanized_GGUF smollm2-135m-humanized-q4_k_m.gguf --local-dir . --local-dir-use-symlinks False
```
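Once downloaded, the GGUF file can be run with llama.cpp's CLI. A minimal sketch, assuming llama.cpp has already been built; the system prompt and sampling settings below are illustrative choices, not values from the model card:

```shell
# Run an interactive conversation with the downloaded quantized model.
# Assumes the llama-cli binary was built in the current directory.
./llama-cli -m smollm2-135m-humanized-q4_k_m.gguf \
  -p "You are a friendly, human-sounding assistant." \
  -cnv \
  -n 256 \
  --temp 0.7
```

Any of the quantizations in the table below can be substituted for the `-m` argument; smaller quants trade response quality for a lower memory footprint.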
| Filename | Quant type | File Size |
|---|---|---|
| smollm2-135m-humanized-q2_k.gguf | Q2_K | 88.2MB |
| smollm2-135m-humanized-q3_k_s.gguf | Q3_K_S | 88.2MB |
| smollm2-135m-humanized-q3_k_m.gguf | Q3_K_M | 93.5MB |
| smollm2-135m-humanized-q3_k_l.gguf | Q3_K_L | 97.5MB |
| smollm2-135m-humanized-q4_0.gguf | Q4_0 | 91.7MB |
| smollm2-135m-humanized-q4_k_s.gguf | Q4_K_S | 102MB |
| smollm2-135m-humanized-q4_k_m.gguf | Q4_K_M | 105MB |
| smollm2-135m-humanized-q5_0.gguf | Q5_0 | 105MB |
| smollm2-135m-humanized-q5_k_s.gguf | Q5_K_S | 110MB |
| smollm2-135m-humanized-q5_k_m.gguf | Q5_K_M | 112MB |
| smollm2-135m-humanized-q6_k.gguf | Q6_K | 138MB |
| smollm2-135m-humanized-q8_0.gguf | Q8_0 | 145MB |
For more information about this model, please visit the original model here.
SmolLM2:
```
@misc{allal2024SmolLM2,
      title={SmolLM2 - with great data, comes great performance},
      author={Loubna Ben Allal and Anton Lozhkov and Elie Bakouch and Gabriel Martín Blázquez and Lewis Tunstall and Agustín Piqueres and Andres Marafioti and Cyril Zakka and Leandro von Werra and Thomas Wolf},
      year={2024},
}
```
Human-Like-DPO-Dataset:
```
@misc{çalık2025enhancinghumanlikeresponseslarge,
      title={Enhancing Human-Like Responses in Large Language Models},
      author={Ethem Yağız Çalık and Talha Rüzgar Akkuş},
      year={2025},
      eprint={2501.05032},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2501.05032},
}
```
UltraFeedback dataset:
```
@misc{cui2023ultrafeedback,
      title={UltraFeedback: Boosting Language Models with High-quality Feedback},
      author={Ganqu Cui and Lifan Yuan and Ning Ding and Guanming Yao and Wei Zhu and Yuan Ni and Guotong Xie and Zhiyuan Liu and Maosong Sun},
      year={2023},
      eprint={2310.01377},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```