llama3-8b-math-sft-mix-8-0 / train_results.json
{
"epoch": 1.9994890883877088,
"total_flos": 316426276208640.0,
"train_loss": 0.3796345431218359,
"train_runtime": 12695.7608,
"train_samples": 109602,
"train_samples_per_second": 17.266,
"train_steps_per_second": 0.27
}
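
The metrics above can be consumed programmatically. Below is a minimal sketch, assuming the file has been downloaded locally as `train_results.json`; it loads the JSON with Python's standard library and derives the throughput from the reported epoch count, sample count, and runtime as a sanity check against the reported `train_samples_per_second`.

```python
import json

# Load the training summary (local path assumed; adjust as needed).
with open("train_results.json") as f:
    metrics = json.load(f)

# Derive throughput from the other fields:
# ~2 epochs over 109,602 samples in ~12,696 s should give roughly 17.3 samples/s,
# which should agree with the reported train_samples_per_second.
derived_samples_per_second = (
    metrics["epoch"] * metrics["train_samples"] / metrics["train_runtime"]
)

print(f"train_loss:          {metrics['train_loss']:.4f}")
print(f"reported samples/s:  {metrics['train_samples_per_second']}")
print(f"derived samples/s:   {derived_samples_per_second:.3f}")
```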