Update README.md
README.md CHANGED
@@ -10,10 +10,11 @@ Open_Gpt4_v0.2
 
 This model is a TIES merger of Mixtral-8x7B-Instruct-v0.1 and bagel-8x7b-v0.2, with MixtralOrochi8x7B as the base model.
 
+
 I was very impressed with MixtralOrochi8x7B's performance and multifaceted use cases, as it is already a merger of many useful Mixtral models such as Mixtral Instruct,
 Noromaid-v0.1-mixtral, openbuddy-mixtral, and possibly other models that were not named. My goal was to expand the model's capabilities and make it even more useful, maybe even competitive with closed-source models like GPT-4. But for that, more testing is required. I hope the community can help me determine if it is deserving of its name. 😊
 
-This is the second
+This is the second iteration of this model, using better models in the merger to improve performance (hopefully).
 
 Base model:
 
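For readers who want to reproduce a merge like the one described, a TIES merge of these checkpoints can be sketched as a mergekit configuration. This is a minimal sketch, not the author's actual config: the repo paths, `density`, `weight`, and `dtype` values below are assumptions, and the placeholder model IDs should be replaced with the real Hugging Face repo paths.

```yaml
# Hypothetical mergekit config sketching a TIES merge of the models named
# in the card. All paths and parameter values are assumptions.
models:
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1
    parameters:
      density: 0.5   # fraction of delta parameters kept after TIES trimming
      weight: 0.5    # this model's contribution to the merged weights
  - model: bagel-8x7b-v0.2          # placeholder; use the real repo path
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: MixtralOrochi8x7B       # placeholder; use the real repo path
dtype: bfloat16
```

With mergekit installed, a config like this would typically be run with `mergekit-yaml config.yml ./merged-model`.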