Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Noromaid-7B-0.4-DPO - GGUF
- Model creator: https://huggingface.co/NeverSleep/
- Original model: https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Noromaid-7B-0.4-DPO.Q2_K.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q2_K.gguf) | Q2_K | 2.53GB |
| [Noromaid-7B-0.4-DPO.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [Noromaid-7B-0.4-DPO.IQ3_S.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [Noromaid-7B-0.4-DPO.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Noromaid-7B-0.4-DPO.IQ3_M.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [Noromaid-7B-0.4-DPO.Q3_K.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q3_K.gguf) | Q3_K | 3.28GB |
| [Noromaid-7B-0.4-DPO.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Noromaid-7B-0.4-DPO.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Noromaid-7B-0.4-DPO.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Noromaid-7B-0.4-DPO.Q4_0.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Noromaid-7B-0.4-DPO.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Noromaid-7B-0.4-DPO.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Noromaid-7B-0.4-DPO.Q4_K.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q4_K.gguf) | Q4_K | 4.07GB |
| [Noromaid-7B-0.4-DPO.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Noromaid-7B-0.4-DPO.Q4_1.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Noromaid-7B-0.4-DPO.Q5_0.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Noromaid-7B-0.4-DPO.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Noromaid-7B-0.4-DPO.Q5_K.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q5_K.gguf) | Q5_K | 4.78GB |
| [Noromaid-7B-0.4-DPO.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Noromaid-7B-0.4-DPO.Q5_1.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Noromaid-7B-0.4-DPO.Q6_K.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q6_K.gguf) | Q6_K | 5.53GB |
| [Noromaid-7B-0.4-DPO.Q8_0.gguf](https://huggingface.co/RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf/blob/main/Noromaid-7B-0.4-DPO.Q8_0.gguf) | Q8_0 | 7.17GB |

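As a minimal sketch of how one of the quants above can be fetched programmatically, the snippet below uses `hf_hub_download` from the `huggingface_hub` package (assumed installed). The repo id and filename pattern come from the table; `quant_filename` and `download` are hypothetical helpers, not part of any library.

```python
# Sketch: download one GGUF quant from this repo with huggingface_hub.
# REPO_ID and the filename pattern are taken from the table above.
REPO_ID = "RichardErkhov/NeverSleep_-_Noromaid-7B-0.4-DPO-gguf"


def quant_filename(quant: str) -> str:
    """Map a quant tag from the table (e.g. "Q4_K_M") to its GGUF filename."""
    return f"Noromaid-7B-0.4-DPO.{quant}.gguf"


def download(quant: str = "Q4_K_M") -> str:
    """Fetch the chosen quant into the local HF cache and return its path."""
    # Imported lazily so the module still loads without huggingface_hub installed.
    from huggingface_hub import hf_hub_download

    return hf_hub_download(repo_id=REPO_ID, filename=quant_filename(quant))


if __name__ == "__main__":
    print(download("Q4_K_M"))
```

Pick a quant that fits your RAM/VRAM budget from the Size column; Q4_K_M is a common quality/size trade-off for 7B models.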



Original model description:
---
license: cc-by-nc-4.0
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/630dfb008df86f1e5becadc3/VKX2Z2yjZX5J8kXzgeCYO.png)


---


# This model is a collab between [IkariDev](https://huggingface.co/IkariDev) and [Undi](https://huggingface.co/Undi95)!

<!-- description start -->
## Description

<!-- [Recommended settings - contributed by localfultonextractor](https://files.catbox.moe/ue0tja.json) -->

This repo contains fp16 files of Noromaid-7b-v0.4-DPO.

[FP16 - by IkariDev and Undi](https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO)

<!-- [GGUF - By TheBloke](https://huggingface.co/TheBloke/Athena-v4-GGUF)-->

<!-- [GPTQ - By TheBloke](https://huggingface.co/TheBloke/Athena-v4-GPTQ)-->

<!-- [exl2[8bpw-8h] - by AzureBlack](https://huggingface.co/AzureBlack/Echidna-13b-v0.3-8bpw-8h-exl2)-->

<!-- [AWQ - By TheBloke](https://huggingface.co/TheBloke/Athena-v4-AWQ)-->

<!-- [fp16 - by IkariDev+Undi95](https://huggingface.co/IkariDev/Athena-v4)-->

[GGUF - by IkariDev and Undi](https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO-GGUF)
<!-- [OLD(GGUF - by IkariDev+Undi95)](https://huggingface.co/IkariDev/Athena-v4-GGUF)-->

## Ratings:

Note: We have permission from all users to upload their ratings; we DON'T screenshot random reviews without asking if we can put them here!

No ratings yet!

If you want your rating to be here, send us a message over on Discord and we'll put up a screenshot of it here. The Discord names are "ikaridev" and "undi".

<!-- description end -->
<!-- prompt-template start -->

## Prompt format: Chatml
```
<|im_start|>system
{sysprompt}<|im_end|>
<|im_start|>user
{input}<|im_end|>
<|im_start|>assistant
{output}<|im_end|>
```
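The template above can be assembled in code when you are not using a chat-template-aware loader. Below is a minimal Python sketch; `chatml_prompt` is a hypothetical helper, not part of any library. The prompt ends with the assistant header so the model generates the reply.

```python
def chatml_prompt(sysprompt: str, user_input: str) -> str:
    """Assemble a ChatML prompt matching the template above.

    The trailing "<|im_start|>assistant\n" leaves the assistant turn open
    for the model to complete; stop generation on "<|im_end|>".
    """
    return (
        f"<|im_start|>system\n{sysprompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_input}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )
```

When sampling, set `<|im_end|>` as a stop sequence so the model does not run into the next turn.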

## Training data used:
- [no_robots dataset](https://huggingface.co/Undi95/Llama2-13B-no_robots-alpaca-lora): lets the model behave in a more human way and enhances its output.
- [Aesir Private RP dataset]: new data from a never-before-used dataset; fresh data, no LimaRP spam, 100% new. Thanks to the [MinervaAI team](https://huggingface.co/MinervaAI) and, in particular, [Gryphe](https://huggingface.co/Gryphe) for letting us use it!
- [Another private Aesir dataset]
- [Another private Aesir dataset]
- [limarp](https://huggingface.co/datasets/lemonilia/LimaRP)

## DPO training data used:
- [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs)
- [NobodyExistsOnTheInternet/ToxicDPOqa](https://huggingface.co/datasets/NobodyExistsOnTheInternet/ToxicDPOqa)
- [Undi95/toxic-dpo-v0.1-NoWarning](https://huggingface.co/datasets/Undi95/toxic-dpo-v0.1-NoWarning)

This is a full finetune.

## Others

Undi: If you want to support me, you can do so [here](https://ko-fi.com/undiai).

IkariDev: Visit my [retro/neocities-style website](https://ikaridevgit.github.io/) please kek