---
base_model:
- EleutherAI/pythia-160m
---
# Model Card
Pythia-160M is designed for research on language model behavior and interpretability, and was trained on the [Pile](https://pile.eleuther.ai/) dataset. Below we report its HellaSwag score; the model can also be fine-tuned for further experimentation.
## HellaSwag Eval
Evaluated with the EleutherAI evaluation harness at the 100,000-step checkpoint revision.
| Task      | Version | Filter | n-shot | Metric   | Value  | Stderr  |
|-----------|--------:|--------|-------:|----------|-------:|--------:|
| hellaswag |       1 | none   |      0 | acc      | 0.2872 | ±0.0045 |
|           |         | none   |      0 | acc_norm | 0.3082 | ±0.0046 |
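A sketch of how these numbers can be reproduced with EleutherAI's `lm-evaluation-harness` (assumes `pip install lm-eval`; the `step100000` branch name follows the Pythia repositories' checkpoint-naming convention for the 100,000-step revision):

```shell
# Run the zero-shot HellaSwag eval against the pinned checkpoint revision.
lm_eval --model hf \
  --model_args pretrained=EleutherAI/pythia-160m,revision=step100000 \
  --tasks hellaswag \
  --num_fewshot 0
```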
## How to Use
This model card was produced as an exercise; the model is not intended for deployment or human-facing interactions.
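For experimentation only, a minimal loading sketch with Hugging Face `transformers` (assumes the library is installed; the `revision` argument pins the same 100,000-step checkpoint branch used in the eval above):

```python
# Sketch: load the pinned Pythia-160M checkpoint and generate a short
# continuation. For research/experimentation only, not deployment.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "EleutherAI/pythia-160m"
tokenizer = AutoTokenizer.from_pretrained(repo, revision="step100000")
model = AutoModelForCausalLM.from_pretrained(repo, revision="step100000")

inputs = tokenizer("The Pile is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```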