---
datasets:
- roneneldan/TinyStories
metrics:
- glue
- super_glue
---

Base model: GPT-Neo

Configs (see the configuration sketch below):
- Vocab size: 10,000
- Hidden size: 512
- Max position embeddings: 512
- Number of layers: 2
- Number of heads: 4
- Window size: 256
- Intermediate size: 1,024
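For reference, here is a minimal sketch of instantiating this architecture with `GPTNeoConfig` from the `transformers` library. The parameter values are taken from the list above; `attention_types` is not stated on this card and is assumed here to alternate global/local attention across the two layers.

```python
# Sketch only: values copied from the config list above.
from transformers import GPTNeoConfig, GPTNeoForCausalLM

config = GPTNeoConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_layers=2,
    num_heads=4,
    window_size=256,
    intermediate_size=1024,
    # Assumption: one global + one local attention layer (not specified on this card).
    attention_types=[[["global", "local"], 1]],
)

# Randomly initialized model with this architecture; load the checkpoint
# from this repo instead if you want the trained weights.
model = GPTNeoForCausalLM(config)
print(model.num_parameters())
```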

Results:

| Task  | Score | Confidence Interval |
|-------|-------|---------------------|
| glue  | 58.36 | [57.95, 58.78]      |
| blimp | 55.64 | [54.68, 56.64]      |