Update README.md
README.md (changed)
@@ -28,7 +28,7 @@ Technical details about Evo can be found in our preprint and our accompanying bl
 As part of our commitment to open science, we release **weights of 15 intermediate pretraining checkpoints** for phase 1 and phase 2 of pretraining. The checkpoints are available as branches of the corresponding HuggingFace repository.
 
-**Evo-1 (Phase 2)** is our **longer context model** in the Evo family, trained at a context length of 131k and tested on generation of sequences of length >
+**Evo-1 (Phase 2)** is our **longer context model** in the Evo family, trained at a context length of 131k and tested on generation of sequences of length >650k
 
 | Checkpoint Name | Description |
 |----------------------------------------|-------------|
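
For context on the checkpoint line above: releasing checkpoints as branches of the HuggingFace repository means a specific intermediate checkpoint can be selected with the `revision` argument when loading the model. Below is a minimal sketch of that pattern; the repository id and branch name are assumptions for illustration, not values confirmed by this change.

```python
# Hypothetical sketch: loading an intermediate pretraining checkpoint published
# as a branch of the model's HuggingFace repository. The repo id and branch
# name are placeholders, not confirmed by this README change.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "togethercomputer/evo-1-131k-base"      # assumed Evo-1 (Phase 2) repo id
checkpoint_branch = "intermediate-checkpoint-1"   # assumed branch name

# `revision` selects a branch (or tag / commit hash) of the repository;
# Evo ships custom model code, so remote code must be trusted.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    revision=checkpoint_branch,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
```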