---
datasets:
- PatrickHaller/dsir-pile-100M-words
language:
- en
library_name: transformers
---
Our model for the 100M-words track of the 2024 BabyLM Challenge.

To download and use this model, the [fla](https://github.com/sustcsonglin/flash-linear-attention) package must be installed:
```bash
pip install -U git+https://github.com/sustcsonglin/flash-linear-attention
```