fblgit committed · Commit fe4ea44 · verified · 1 Parent(s): ec3cd94

Update README.md

Files changed (1):
  1. README.md +4 -2
README.md CHANGED

@@ -107,12 +107,14 @@ model-index:
 # TheBeagle-v2beta-32B-MGS
 This model is an experimental version of our latest innovation: `MGS`. Its up to you to figure out what does it means, but its very explicit.
 We didn't applied our known `UNA` algorithm to the forward pass, but they are entirely compatible and operates in different parts of the neural network and in different ways, tho they both can be seen as a regularization technique.
+![TheBeagle-v2-MGS](https://huggingface.co/fblgit/TheBeagle-v2beta-32B-MGS/resolve/main/TheBeagle-v2-MGS.png)
 
+`.. In the Loving Memory of my LoLa, coming back to your heart ..`
 
 ## MGS
 MGS stands for... Many-Geeks-Searching... and thats it. Hint: `1+1 is 2, and 1+1 is not 3`
 
-We still believe on 1-Epoch should be enough, so we just did 1 Epoch only.
+We still believe on 1-Epoch should be enough, so we just did 1 Epoch only as usual.
 
 ## Dataset
 Used here the first decent (corpora & size) dataset on the hub: `Magpie-Align/Magpie-Pro-300K-Filtered`

@@ -124,7 +126,7 @@ It achieves the following results on the evaluation set:
 
 [All versions available](https://huggingface.co/fblgit/TheBeagle-v2beta-MGS-GGUF/tree/main)
 
-EXL2 by bartowski:
+Quantz by bartowski:
 https://huggingface.co/bartowski/TheBeagle-v2beta-32B-MGS-GGUF
 
 