---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- Anthropic/hh-rlhf
---
Pythia-70m supervised fine-tuned on the Anthropic/hh-rlhf dataset for 1 epoch (the SFT model), then trained with DPO (Direct Preference Optimization) on the same dataset for 1 epoch.
Benchmark evaluations were done using lm-evaluation-harness.
See Pythia-70m for details on the original model (paper).
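As background on the training step above, the DPO objective can be sketched in plain Python. This is an illustrative sketch of the loss formula, not this model's actual training code; the function name, inputs, and `beta` value are assumptions for the example.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair:
    -log sigmoid(beta * ((log-ratio under policy) - (log-ratio under reference))).
    Log-probs are summed over the response tokens; beta is illustrative."""
    pi_logratio = policy_chosen_logp - policy_rejected_logp
    ref_logratio = ref_chosen_logp - ref_rejected_logp
    # -log sigmoid(x) == log(1 + exp(-x))
    return math.log1p(math.exp(-beta * (pi_logratio - ref_logratio)))

# Toy numbers: the policy prefers the chosen response more strongly
# than the reference model does, so the loss is below -log sigmoid(0).
loss = dpo_loss(-1.0, -3.0, -2.0, -2.5)
```

Minimizing this loss pushes the policy to assign a larger log-probability margin to the chosen response than the frozen SFT reference model does.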