Base model: GPT-Neo

Configs (see the sketch below):

  • Vocab size: 10,000
  • Hidden size: 512
  • Max position embeddings: 512
  • Number of layers: 2
  • Number of heads: 4
  • Window size: 256
  • Intermediate size: 1024
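
The configuration above maps directly onto the Hugging Face `GPTNeoConfig`. Below is a minimal sketch (not the authors' training code) of how an equivalent model could be instantiated; the `attention_types` pattern is an assumption, and the custom PReLU activation that gives this model its name is not reproduced here.

```python
from transformers import GPTNeoConfig, GPTNeoForCausalLM

# Sketch of the reported configuration. The global/local attention pattern is
# an assumption (GPT-Neo requires attention_types to cover num_layers); the
# custom PReLU activation is not part of the stock GPTNeoConfig.
config = GPTNeoConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_layers=2,
    num_heads=4,
    window_size=256,            # local-attention window size
    intermediate_size=1024,
    attention_types=[[["global", "local"], 1]],  # assumed pattern, 2 layers total
)

model = GPTNeoForCausalLM(config)  # randomly initialized, not the released weights
print(f"Parameters: {model.num_parameters():,}")
```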

Results:

  • GLUE: 58.83 (confidence interval [57.60, 59.82])
  • BLiMP: 57.60 (confidence interval [56.34, 58.83])
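
For reference, a loading sketch is given below. It assumes the checkpoint works with the standard `transformers` auto-classes; if the custom PReLU activation requires the authors' own modeling code, `trust_remote_code=True` or a manual import may be needed.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "AISE-TUDelft/Custom-Activations-GPT-PReLU"

# Assumption: tokenizer and weights load with the stock auto-classes.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Small generation check (the prompt is arbitrary).
inputs = tokenizer("The child saw the", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```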
