Update README.md
README.md CHANGED
@@ -3,6 +3,13 @@ license: other
 language:
 - en
 ---
+
+[EXL2](https://github.com/turboderp/exllamav2/tree/master#exllamav2) Quantization of [MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b).
+
+Quantized at 8.13bpw.
+
+# Original model card
+
 MythoMist 7b is, as always, a highly experimental Mistral-based merge based on my latest (still in development) algorithm, which actively benchmarks the model as it's being built in pursuit of a goal set by the user.
 
 **Addendum (2023-11-23)**: A more thorough investigation revealed a flaw in my original algorithm that has since been resolved. I've debated deleting this model as it did not follow its original objective but since there are plenty of folks enjoying it I'll be keeping it around.
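
For reference, below is a minimal sketch of loading an EXL2 quant like this one with the exllamav2 Python API. The class names follow the library's bundled inference example at the time of writing and may differ between versions; the local model directory and sampling values are placeholders, not part of this release.

```python
# Sketch: load an EXL2-quantized model (e.g. this 8.13bpw quant) with exllamav2.
# Assumes the quantized weights have already been downloaded to a local folder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "MythoMist-7b-exl2"  # placeholder path to the downloaded quant
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs as needed

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example sampling values, not a recommendation
settings.top_p = 0.9

output = generator.generate_simple("Once upon a time,", settings, 200)
print(output)
```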