divinetaco committed • Commit 68caac8 • Parent(s): 50cece1

Upload README.md with huggingface_hub

README.md CHANGED
@@ -11,7 +11,7 @@ tags:
 # aranea-ancilla-116b-v1.0
 **aka MiquMaid-v1-70B + interleaved WinterGoddess-1.4x-70B-L2**
 
-![image/png](https://huggingface.co/divinetaco/aranea-ancilla-116b-v1.0/resolve/main/aranea-ancilla.png)
+![image/png](https://huggingface.co/divinetaco/aranea-ancilla-116b-v1.0-6.4bpw-exl2/resolve/main/aranea-ancilla.png)
 
 A [mergekit](https://github.com/arcee-ai/mergekit) frankenmerge based on [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B) with interleaved layers of [Sao10K/WinterGoddess-1.4x-70B-L2](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).
 This was the top performing model from a series of merge experiments to create a highly coherent creative writing model.
@@ -38,4 +38,4 @@ No license. Component models based on the [Mistral AI Miqu-1](https://huggingfac
 
 ### Quantizations
 
-Exllamav2 quants will be available when bandwidth permits.
+Exllamav2 quants will be available when bandwidth permits.
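The README describes the model as a mergekit frankenmerge with interleaved layers. The actual merge configuration is not part of this commit, but such a merge is typically expressed as a mergekit passthrough config that alternates layer slices from the two parent models. The slice boundaries below are illustrative placeholders, not the values used for aranea-ancilla:

```yaml
# Hypothetical mergekit passthrough config sketching an interleaved
# frankenmerge. Layer ranges are placeholders; the real boundaries
# used for aranea-ancilla-116b-v1.0 are not published in this commit.
slices:
  - sources:
      - model: NeverSleep/MiquMaid-v1-70B
        layer_range: [0, 20]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [10, 20]
  - sources:
      - model: NeverSleep/MiquMaid-v1-70B
        layer_range: [20, 40]
merge_method: passthrough
dtype: float16
```

With `merge_method: passthrough` the slices are concatenated rather than averaged, which is how a 116B model can be built from two 70B parents: the overlapping ranges are duplicated, increasing the total layer count.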