This model uses the LTG-BERT architecture. It was trained on a combination of the BabyLM Dataset, the TinyStories Dataset, and generated data, in accordance with the rules of the Strict-Small track and its 10M-word budget.

The model was trained with a sequence length of 128 tokens.

Hyperparameters and evaluation scores will follow in a subsequent update.

Note that the model relies on custom modeling code, so it is not available through the HF Inference API or the supported Inference Providers and must be loaded locally with `trust_remote_code=True`.
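As a usage sketch, the checkpoint should load through `transformers` with `trust_remote_code=True`. The masked-LM decoding below is illustrative: the example sentence and variable names are assumptions, not part of this card.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "nikitastheo/BERTtime-Stories-10m-nucleus-1-balanced"

# LTG-BERT ships custom modeling code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)
model.eval()

# Illustrative masked-LM query; any sentence with a single mask token works.
text = f"The cat sat on the {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring vocabulary item at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```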
