rombodawg committed
Commit 5d4e897
Parent: aaab769

Update README.md

Files changed (1): README.md (+2 -1)
README.md CHANGED
@@ -9,7 +9,7 @@ Everyone-Coder-4x7b
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/642cc1c253e76b4c2286c58e/ECrHQnZnv8UM9GUCQtlWW.jpeg)
 
-EveryoneLLM series of models are a new Mixtral type model created using experts that were finetunes by the community, for the community. This is the first model to release in the series and it is a coding specific model. EveryoneLLM which will be a more generalized model will be released in the near future after more work is done to fine tune the process of mergin Mistral models into a larger Mixtral model with success.
+EveryoneLLM series of models are a new Mixtral type model created using experts that were finetuned by the community, for the community. This is the first model to release in the series and it is a coding specific model. EveryoneLLM which will be a more generalized model will be released in the near future after more work is done to fine tune the process of mergin Mistral models into a larger Mixtral model with success.
 
 The goal of the EveryoneLLM series of models is to be a replacement or an alternative to Mixtral-8x7b that is more suitable for general and specific use, as well as easier to fine tune. Since mistral ai is being secretive about the "secret sause" that makes Mixtral-Instruct such an effective fine tune of the Mixtral-base model, I've decided its time for the community to directly compete with Mistralai on our own.
 
@@ -82,6 +82,7 @@ experts:
 - "Divide 8 by 2: 8 / 2 = 4"
 - "Find the remainder when 9 is divided by 3: 9 % 3 = 0"
 - "Calculate the square root of 16: sqrt(16) = 4"
+
 - "Simplify the expression (a+b)/(c-d): (a+b)/(c-d)"
 - "Factor out the common factor of 2 from 4x + 6y: 2(2x + 3y)"
 - "Solve for x in the equation 3x - 7 = 2x + 5: x = 12"