MathCoder2

Introduction

The MathCoder2 models were created by continued pretraining on MathCode-Pile. They are introduced in the paper "MathCoder2: Better Math Reasoning from Continued Pretraining on Model-translated Mathematical Code".

The mathematical pretraining dataset pairs mathematical code with the natural-language reasoning steps it implements, making it a valuable resource for training models aimed at advanced mathematical reasoning tasks.
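As a sketch of how this checkpoint might be run with the Hugging Face `transformers` library (assuming `transformers`, `torch`, and `accelerate` are installed; the prompt format and generation settings below are assumptions, since the card does not specify them):

```python
MODEL_ID = "MathGenie/MathCoder2-Llama-3-8B"


def build_prompt(question: str) -> str:
    # MathCoder2 is a continued-pretrained base model, so a plain
    # question/solution text prompt is used here; this exact format
    # is an assumption, not prescribed by the model card.
    return f"Question: {question}\nSolution:"


def generate_solution(question: str, max_new_tokens: int = 256) -> str:
    # Imported lazily because loading the model pulls in heavy
    # dependencies and ~16 GB of BF16 weights on first use.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keeps the checkpoint's BF16 weights
        device_map="auto",    # requires accelerate; places layers on GPU/CPU
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate_solution("What is the sum of the first 100 positive integers?"))
```

Greedy decoding (`do_sample=False`) is used above for reproducible math outputs; sampling parameters can be swapped in as needed.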

Evaluation

[Evaluation results figure]

Model size: 8.03B params (Safetensors)
Tensor type: BF16

Model tree for MathGenie/MathCoder2-Llama-3-8B

Dataset used to train MathGenie/MathCoder2-Llama-3-8B: MathCode-Pile