Update README.md
README.md CHANGED
@@ -18,7 +18,7 @@ library_name: transformers
 
 # Model Summary
 
-> OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in
+> OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in January 2025 (0125) that is 100% open-source. It is an improved version of OLMoE-09-24, see the [paper appendix](https://arxiv.org/abs/2409.02060) for details.
 
 This information and more can also be found on the [**OLMoE GitHub repository**](https://github.com/allenai/OLMoE).
 
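For context, below is a minimal sketch of loading the model described in this card with the transformers library named in its metadata (`library_name: transformers`). The repository id `allenai/OLMoE-1B-7B-0125` is an assumption inferred from the "0125" naming in the summary; it is not stated in this diff, so substitute the actual Hub id if it differs.

```python
# Minimal sketch: load and run the 0125 OLMoE release via transformers.
# The model id below is an assumption based on the summary's naming.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0125"  # assumed Hub id for the January 2025 (0125) release

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Although the checkpoint holds 7B total parameters, the MoE router activates
# only a subset of experts per token, so roughly 1B parameters are used per step.
inputs = tokenizer("Mixture-of-Experts models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```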