---
license: mit
language:
- en
---
# GPT-BERT (BabyLM 10M)
A submission to the 2024 BabyLM Challenge, trained on [Baby-cosmo-fine-10M](https://huggingface.co/datasets/ltg/babylm-2024-baby-cosmo-fine-10m).
The training scripts are published here: https://github.com/ltgoslo/gpt-bert
```bibtex
@inproceedings{charpentier-samuel-2024-bert,
    title = "{BERT} or {GPT}: why not both?",
    author = "Charpentier, Lucas Georges Gabriel and
      Samuel, David",
    editor = "Hu, Michael Y. and
      Mueller, Aaron and
      Ross, Candace and
      Williams, Adina and
      Linzen, Tal and
      Zhuang, Chengxu and
      Choshen, Leshem and
      Cotterell, Ryan and
      Warstadt, Alex and
      Wilcox, Ethan Gotlieb",
    booktitle = "The 2nd BabyLM Challenge at the 28th Conference on Computational Natural Language Learning",
    month = nov,
    year = "2024",
    address = "Miami, FL, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.conll-babylm.24/",
    pages = "262--283",
}
```