---
license: apache-2.0
datasets:
  - MathGenie/MathCode-Pile
language:
  - en
metrics:
  - accuracy
base_model:
  - deepseek-ai/deepseek-math-7b-base
pipeline_tag: text-generation
tags:
  - math
---

# MathCoder2

## Introduction

The MathCoder2 models are created by continued pretraining on MathCode-Pile. They are introduced in the paper *MathCoder2: Better Math Reasoning from Continued Pretraining on Model-translated Mathematical Code*.

The mathematical pretraining dataset pairs mathematical code with the natural language reasoning steps it corresponds to, making it a valuable resource for training models aimed at advanced mathematical reasoning.
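
Since the models are standard causal language models for text generation, they can be loaded with the Hugging Face `transformers` library. The sketch below is illustrative only: the repository id is a placeholder and should be replaced with the id of the MathCoder2 checkpoint you want to use.

```python
# Minimal sketch of loading a MathCoder2 model with transformers.
# The repo id below is a placeholder; substitute the actual checkpoint id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MathGenie/MathCoder2-DeepSeekMath-7B"  # assumed id, replace as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Question: What is the sum of the first 100 positive integers?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```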

## Evaluation

*(Figure: evaluation results of the MathCoder2 models.)*