IHaBiS committed 22c53fc · 1 parent: ef2f542

Create README.md
---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---
## Exl2 version of [Undi95/FlatDolphinMaid-8x7B](https://huggingface.co/Undi95/FlatDolphinMaid-8x7B)

## Branch

3.5bh8 : 3.5 bpw, 8-bit head (h8)

Using The Pile shard [0007.parquet](https://huggingface.co/datasets/EleutherAI/the_pile_deduplicated/resolve/refs%2Fconvert%2Fparquet/default/train/0007.parquet) as the calibration dataset.

Quantization settings:

```shell
python convert.py -i models/Undi95_FlatDolphinMaid-8x7B -o FlatDolphinMaid-8x7B-temp -cf FlatDolphinMaid-8x7B-3.5bpw-h8-exl2 -c 0007.parquet -l 8192 -b 3.5 -hb 8 -m FlatDolphinMaid-8x7B-measurement.json -ml 8192
```
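Back-of-the-envelope, quantized weight size is roughly parameters × bpw / 8 bytes. A minimal sketch of that arithmetic, assuming the commonly cited ~46.7B total parameter count for Mixtral-8x7B-class models (not stated in this card), and ignoring the 8-bit head and any per-tensor overhead:

```python
def quant_size_gb(n_params: float, bpw: float) -> float:
    """Rough weight-file size for an n_params model quantized at bpw bits per weight."""
    return n_params * bpw / 8 / 1e9  # bits -> bytes -> GB

# Illustrative only: 46.7e9 is an assumed total parameter count, not from this card.
print(f"{quant_size_gb(46.7e9, 3.5):.1f} GB")  # ~20.4 GB of weights at 3.5 bpw
```

This excludes the KV cache and runtime buffers, so actual VRAM needed at the 8192-token context used above will be higher.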
### Below this line is the original README

First experimental merge of Noromaid 8x7b (Instruct) and Dolphin 8x7b. The idea behind this is to add a little more IQ to the model, because Noromaid was only trained on RP/ERP data. Dolphin 2.7 is the only real Mixtral finetune I consider "usable", and so the merging quest begins again, kek.

Merged Dolphin 2.7 with the Mixtral base (Dolphin at 1.0 weight) to get rid of ChatML, and then merged Noromaid 8x7b into the output with the SLERP method.
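The SLERP step above can be sketched as follows. This is a generic spherical linear interpolation over flattened weight tensors under the usual definition, not necessarily the exact implementation the author's merge tooling uses:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc between
    the two directions rather than the straight line of plain averaging.
    """
    n0 = v0 / (np.linalg.norm(v0) + eps)
    n1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:  # nearly parallel tensors: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1
```

In a real merge this would be applied tensor by tensor (often with different `t` per layer); merge tools also handle dtype, tokenizer, and MoE router details that this sketch omits.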

This model feels better on the IQ chart and has about the same average ERP score on Ayumi's bench as Noromaid 8x7b, but it's softer and more prudish, and it also hits the typical Mixtral repetition issue at some point. Choose your poison.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/uZlU0PEPtKPZPLzXcoqJ_.png)

<!-- description start -->
## Description

This repo contains fp16 files of FlatDolphinMaid-8x7B.

<!-- description end -->
<!-- description start -->
## Models used

- [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)
- [cognitivecomputations/dolphin-2.7-mixtral-8x7b](https://huggingface.co/cognitivecomputations/dolphin-2.7-mixtral-8x7b)
- [NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3](https://huggingface.co/NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3)

<!-- description end -->
<!-- prompt-template start -->
### Custom format:
```
### Instruction:
{system prompt}

### Input:
{input}

### Response:
{reply}
```
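The template above can be assembled programmatically. `build_prompt` is an illustrative helper name, not part of any library; the string layout simply follows the format block above:

```python
def build_prompt(system_prompt: str, user_input: str) -> str:
    """Fill the model's custom Alpaca-style template from the card above."""
    return (
        "### Instruction:\n"
        f"{system_prompt}\n\n"
        "### Input:\n"
        f"{user_input}\n\n"
        "### Response:\n"  # generation continues from here
    )
```

Generation should then be stopped on the next `### ` header so the model does not continue the conversation on its own.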

If you want to support me, you can do so [here](https://ko-fi.com/undiai).