---
datasets:
  - roneneldan/TinyStories
metrics:
  - babylm
---

Base model: GPT-Neo

Configs:

- Vocab size: 10,000
- Hidden size: 512
- Max position embeddings: 512
- Number of layers: 2
- Number of heads: 4
- Window size: 256
- Intermediate size: 256
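The configuration above can be sketched with the Hugging Face `transformers` library. This is a minimal illustration, not the training script used for this checkpoint; in particular, the `attention_types` pattern is an assumption (one global/local pair covering the 2 layers), since the card does not specify it.

```python
from transformers import GPTNeoConfig, GPTNeoForCausalLM

# Config values taken from the table above.
config = GPTNeoConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_layers=2,
    num_heads=4,
    window_size=256,
    intermediate_size=256,
    # Assumption: one global/local attention pair spanning the 2 layers.
    attention_types=[[["global", "local"], 1]],
)

# Instantiate a randomly initialized model with this architecture.
model = GPTNeoForCausalLM(config)
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")
```

A model this small (two layers, 512-dim hidden states) is typical for TinyStories-scale experiments, where the goal is coherent generation at minimal capacity.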

Results:

- Task: GLUE — Score: 55.15, Confidence interval: [52.54, 56.73]
- Task: BLiMP — Score: 55.38, Confidence interval: [53.68, 56.47]