# MegaMath-Llama-3.2-1B

A proof-of-concept model trained on the MegaMath dataset, capable of both Chain-of-Thought (CoT) and Program-Aided Language (PAL) problem solving.
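In the PAL style, the model answers a math question by emitting a short Python program whose execution produces the final answer, instead of stating the result directly in text. The sketch below shows only the executor side of that loop; the `generated` string is a hypothetical model completion written for illustration, not actual MegaMath output, and `run_pal` is an assumed helper name.

```python
# Program-Aided Language (PAL) solving: the model writes a Python
# program; we execute it and read off the returned answer.
# `generated` is a hypothetical model completion (illustrative only).
generated = """
def solution():
    # "Natalia sold clips to 48 friends in April, and half as many in May."
    april = 48
    may = april // 2
    return april + may
"""

def run_pal(program: str):
    """Execute a model-generated program and return its solution()."""
    namespace = {}
    exec(program, namespace)  # note: only run trusted/sandboxed code
    return namespace["solution"]()

print(run_pal(generated))  # 72
```

In practice the generated program should be executed in a sandboxed interpreter rather than with a bare `exec`, since it is model-produced code.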


## Performance


## Citation

If you find our work useful, please cite:

```bibtex
@article{zhou2025megamath,
  title     = {MegaMath: Pushing the Limits of Open Math Corpora},
  author    = {Zhou, Fan and Wang, Zengzhi and Ranjan, Nikhil and Cheng, Zhoujun and Tang, Liping and He, Guowei and Liu, Zhengzhong and Xing, Eric P.},
  journal   = {arXiv preprint arXiv:2504.xxxxx},
  year      = {2025},
  note      = {Preprint}
}
```
- Format: Safetensors
- Model size: 1.24B params
- Tensor type: BF16

Trained on the LLM360 MegaMath dataset.