---
license: apache-2.0
---
## Evo 2
Evo 2 is a state-of-the-art DNA language model trained autoregressively on trillions of DNA tokens.
For instructions, details, and examples, please refer to the [GitHub repository](https://github.com/ArcInstitute/evo2) and the [paper]().
Evo 2 40B and 7B checkpoints, trained with a context length of up to 1 million tokens, are available here:

| Checkpoint name | Num layers | Num parameters |
|------------------------------|----|----------|
| [evo2_40b](https://huggingface.co/arcinstitute/evo2_40b) | 50 | 40B |
| [evo2_7b](https://huggingface.co/arcinstitute/evo2_7b) | 32 | 7B |

We also share 40B, 7B, and 1B base checkpoints trained at a context length of 8,192 tokens:

| Checkpoint name | Num layers | Num parameters |
|------------------------------|----|----------|
| [evo2_40b_base](https://huggingface.co/arcinstitute/evo2_40b_base) | 50 | 40B |
| [evo2_7b_base](https://huggingface.co/arcinstitute/evo2_7b_base) | 32 | 7B |
| [evo2_1b_base](https://huggingface.co/arcinstitute/evo2_1b_base) | 25 | 1B |
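
As a quick start, below is a minimal usage sketch for this checkpoint. It assumes the `evo2` Python package from the GitHub repository above; the `Evo2` class, `tokenizer` attribute, and `generate` signature are taken from the repository's examples and may differ from the current API, so please refer to the GitHub repository for authoritative instructions.

```python
# Minimal usage sketch (assumed API, based on examples in the Evo 2 GitHub
# repository; class and method names may differ in the current release).
import torch
from evo2 import Evo2  # assumed package / class name from the GitHub repo

# Load the 1B base checkpoint by name (assumed identifier).
model = Evo2('evo2_1b_base')

# Tokenize a DNA sequence and run a forward pass to get per-position logits.
sequence = 'ACGTACGT'
input_ids = torch.tensor(
    model.tokenizer.tokenize(sequence),  # assumed tokenizer interface
    dtype=torch.int,
).unsqueeze(0).to('cuda:0')

with torch.inference_mode():
    outputs, _ = model(input_ids)
logits = outputs[0]  # logits over the DNA vocabulary at each position

# Autoregressive generation from a DNA prompt (assumed signature).
generated = model.generate(
    prompt_seqs=['ACGT'],
    n_tokens=100,
    temperature=1.0,
    top_k=4,
)
```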