Muennighoff committed · Commit 267f373 · verified · 1 Parent(s): 537c372

Update README.md

Files changed (1):
  1. README.md +0 -4
README.md CHANGED
@@ -28,10 +28,6 @@ Branches:
 - `load-balancing`: Ablation with load balancing loss during SFT
 - `non-annealed`: Ablation starting from the checkpoint prior to annealing (branch `step1200000-tokens5033B` of https://hf.co/allenai/OLMoE-1B-7B-0924) rather than the annealed checkpoint (branch `main` of https://hf.co/allenai/OLMoE-1B-7B-0924)
 
-# Bias, Risks, and Limitations
-This adapted OLMo model is a research artifact. It is intended to benefit the research community interested in understanding the safety properties of LLMs and developers building safety tools for LLMs. For this reason, the model does not include a specific safety filter or safety training data.
-While the model refuses some requests, it is possible for the model to generate harmful and sensitive content from some user prompts. We recommend developers exercise caution and consider the risks of the applications of this technology. Furthermore, developers should consider implementing safeguards for biases, privacy, and other potential harms when appropriate. Finally, as with every LLM, OLMo may produce factual-sounding outputs that may not be true, so developers and users are encouraged to confirm such outputs before relying on them. All users of this model are responsible for how they use the model.
-
 # Citation
 
 ```bibtex
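
The context lines in the hunk above reference specific branches of the base checkpoint. As a minimal sketch of how those revisions could be fetched (assuming a `transformers` release with OLMoE support; only the repo ID and branch names come from the diff, everything else is illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE = "allenai/OLMoE-1B-7B-0924"  # base checkpoint referenced in the diff

tokenizer = AutoTokenizer.from_pretrained(BASE)

# Pre-annealing checkpoint used by the `non-annealed` ablation
# (repo branch `step1200000-tokens5033B`, selected via `revision=`).
non_annealed = AutoModelForCausalLM.from_pretrained(
    BASE, revision="step1200000-tokens5033B"
)

# Annealed checkpoint (repo branch `main`).
annealed = AutoModelForCausalLM.from_pretrained(BASE, revision="main")
```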
 