Text Generation · Transformers · Safetensors · English · olmoe · Mixture of Experts · olmo · Inference Endpoints
Muennighoff committed · Commit 9b0c1aa · verified · 1 Parent(s): 81ae7ca

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ library_name: transformers
 
 # Model Summary
 
-> OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024 (0125). It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.
+> OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in January 2025 (0125) that is 100% open-source. It is an improved version of OLMoE-09-24, see the [paper appendix](https://arxiv.org/abs/2409.02060) for details.
 
 This information and more can also be found on the [**OLMoE GitHub repository**](https://github.com/allenai/OLMoE).
 
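
For context, the model described by this README can be loaded with the `transformers` library named in the diff header (`library_name: transformers`). Below is a minimal sketch; the repo ID `allenai/OLMoE-1B-7B-0125` is an assumption inferred from the 0125 version tag and is not stated in this commit.

```python
# Minimal sketch: load and run the OLMoE checkpoint with Hugging Face transformers.
# The repo ID is an assumption inferred from the 0125 tag, not stated in this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0125"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to confirm the model loads and runs.
inputs = tokenizer("Mixture-of-Experts language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```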