Kquant03 committed
Commit 1b56e7f
1 Parent(s): 1e192fe

Update README.md

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -20,7 +20,10 @@ The config looks like this...(detailed version is in the files and versions):
  - [rwitz/go-bruins-v2](https://huggingface.co/rwitz/go-bruins-v2) - expert #2
  - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #3
  - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #4
+
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/GlhMcDiRhmUOsITmBplVT.png)

+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/cK0isGt1Nm2lEXZ9INrfu.png)
  # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
  ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
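For context, the expert list in this hunk comes from a mergekit MoE (frankenMoE) config, which the README's lead-in ("The config looks like this...") summarizes. Below is a minimal sketch of that config's typical shape, offered as an illustration only: the `base_model`, `gate_mode`, `dtype`, and `positive_prompts` values are placeholders I am assuming, expert #1 falls outside this hunk and is omitted, and the authoritative config lives in this repo's files and versions, as the hunk header notes.

```yaml
# Hypothetical mergekit-moe config sketch; the real values are in this repo's
# "Files and versions" tab, not here.
base_model: mlabonne/Beagle14-7B   # placeholder base model (assumption)
gate_mode: hidden                  # placeholder; mergekit supports hidden / cheap_embed / random
dtype: bfloat16                    # placeholder precision
experts:
  # expert #1 is outside this diff hunk and intentionally omitted
  - source_model: rwitz/go-bruins-v2         # expert #2
    positive_prompts:
      - "placeholder routing prompt for this expert"
  - source_model: mlabonne/Beagle14-7B       # expert #3
    positive_prompts:
      - "placeholder routing prompt for this expert"
  - source_model: mlabonne/Beagle14-7B       # expert #4
    positive_prompts:
      - "placeholder routing prompt for this expert"
```

A config of this shape is typically consumed by the `mergekit-moe` CLI, which, with `gate_mode: hidden`, derives each expert's router gate from the hidden-state representations of its `positive_prompts`.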