---
library_name: transformers
datasets:
- mlabonne/orpo-dpo-mix-40k
metrics:
- accuracy
base_model:
- EleutherAI/gpt-neo-125m
---
|
|
|
# Model Card
|
|
|
EleutherAI/gpt-neo-125M fine-tuned on the [mlabonne/orpo-dpo-mix-40k](https://huggingface.co/datasets/mlabonne/orpo-dpo-mix-40k) dataset.
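
Below is a minimal usage sketch with 🤗 Transformers. The card does not state the repo ID under which the fine-tuned checkpoint is published, so the `model_id` in the snippet is a placeholder to replace with the actual model path.

```python
# Minimal usage sketch. "your-username/gpt-neo-125m-orpo-dpo-mix-40k" is a
# hypothetical repo ID (the card does not give one); substitute the real path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gpt-neo-125m-orpo-dpo-mix-40k"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # GPT-Neo's tokenizer has no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```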
|
|
|
|
|
|
## Evaluation

Zero-shot results on HellaSwag. The run was launched with `batch_size = auto:4` and auto-detected a largest usable batch size of 64.
|
|
| Tasks |Version|Filter|n-shot| Metric | |Value | |Stderr|
|---------|------:|------|-----:|--------|---|-----:|---|-----:|
|hellaswag| 1|none | 0|acc |↑ |0.2868|± |0.0045|
| | |none | 0|acc_norm|↑ |0.3050|± |0.0046|
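
The run log and the table format above match the output of EleutherAI's lm-evaluation-harness. Assuming that harness was used, the sketch below shows how the numbers could be reproduced through its Python API; as above, the model path is a placeholder, not the card's actual repo ID.

```python
# Sketch of re-running the HellaSwag eval with lm-evaluation-harness
# (pip install lm-eval). The pretrained= path is a hypothetical repo ID.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=your-username/gpt-neo-125m-orpo-dpo-mix-40k",
    tasks=["hellaswag"],
    num_fewshot=0,        # matches the n-shot column above
    batch_size="auto:4",  # mirrors `batch_size = auto:4` from the run log
)
print(results["results"]["hellaswag"])  # acc / acc_norm with stderr
```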
|
|
|
|