---
datasets:
- roneneldan/TinyStories
metrics:
- babylm
---
Base model: GPT-Neo

Configs:
- Vocab size: 10,000
- Hidden size: 512
- Max position embeddings: 512
- Number of layers: 2
- Number of heads: 4
- Window size: 256
- Intermediate size: 1024
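
A minimal sketch of how a configuration with these values could be instantiated using the Hugging Face `transformers` GPT-Neo implementation. The alternating global/local attention pattern is not stated in this card and is assumed here (it is the GPT-Neo default); this is not the original training script.

```python
from transformers import GPTNeoConfig, GPTNeoForCausalLM

# Values taken from the Configs list above.
config = GPTNeoConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_layers=2,
    num_heads=4,
    window_size=256,                              # local-attention window size
    intermediate_size=1024,
    attention_types=[[["global", "local"], 1]],   # assumed: one global + one local layer
)

# Randomly initialized model with this architecture.
model = GPTNeoForCausalLM(config)
print(f"Parameters: {model.num_parameters():,}")
```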
Results:
- Task: GLUE
  Score: 58.83
  Confidence interval: [57.6, 59.82]
- Task: BLiMP
  Score: 57.60
  Confidence interval: [56.34, 58.83]
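
A usage sketch for loading the checkpoint and generating text with `transformers`. The repository id below is a hypothetical placeholder, not this model's actual identifier; replace it with the real one.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "user/tinystories-gpt-neo"  # hypothetical placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation in the TinyStories style.
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```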