Eagle 7B is a 7.52B parameter model that:
(a linear transformer with 10-100x+ lower inference cost)
- Ranks as the world's greenest 7B model (per token)
- Trained on 1.1 trillion tokens across 100+ languages
  (70% English, 15% multilingual, 15% code)
- Outperforms all 7B-class models in multi-lingual benchmarks
- Approaches Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?) levels of performance in English evals
- Trades blows with MPT-7B (1T) in English evals
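For a rough sense of scale, the stated training mix can be turned into approximate per-category token counts. This is a minimal sketch based only on the 1.1T total and the 70/15/15 split quoted above; the category names are illustrative labels, not official dataset names.

```python
# Approximate token counts implied by the stated 1.1T-token training mix
# (70% English, 15% multilingual, 15% code).
TOTAL_TOKENS = 1.1e12

mix = {"english": 0.70, "multilingual": 0.15, "code": 0.15}
tokens = {name: share * TOTAL_TOKENS for name, share in mix.items()}

for name, count in tokens.items():
    # Report in billions of tokens for readability.
    print(f"{name}: ~{count / 1e9:.0f}B tokens")
```

This works out to roughly 770B English tokens and about 165B each of multilingual and code tokens.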