TheMelonGod committed · Commit 8d35a16 · verified · 1 Parent(s): 0c14835

Update README.md

Files changed (1): README.md (+45 -3)
README.md CHANGED
@@ -1,3 +1,45 @@
- ---
- license: llama3.1
- ---
+ ---
+ license: llama3.1
+ language:
+ - en
+ quantized_by: TheMelonGod
+ pipeline_tag: text-generation
+ tags:
+ - quantized
+ - safetensors
+ - exllamav2
+ base_model:
+ - Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
+ base_model_relation: quantized
+ ---
+ ExLlamaV2 quantizations of: [Joseph717171 - Llama-3.1-SuperNova-8B-Lite_TIES_with_Base](https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base)
+
+ Quantizations (6hb = 6 head bits):
+ - [8.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/8.0bpw)
+ - [7.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/7.5bpw)
+ - [7.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/7.0bpw)
+ - [6.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/6.5bpw)
+ - [6.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/6.0bpw)
+ - [5.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/5.5bpw)
+ - [5.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/5.0bpw)
+ - [4.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/4.5bpw)
+ - [4.25bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/4.25bpw)
+ - [4.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/4.0bpw)
+ - [3.75bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/3.75bpw)
+ - [3.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/3.5bpw)
+ - [3.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/3.0bpw)
+ - [2.75bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/2.75bpw)
+ - [2.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/2.5bpw)
+ - [2.25bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/2.25bpw)
+ - [2.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2/tree/2.0bpw)
+
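Each listed bits-per-weight variant lives on its own branch of the repository (the `tree/<bpw>` suffix in the links above), so a single quantization can be fetched by passing that branch name as the `revision`. A minimal sketch using `huggingface_hub` — the repo id and branch name are copied from the links above, and the local directory name is just an example:

```python
# Sketch: download one ExLlamaV2 quantization branch from the Hugging Face Hub.
# Assumes: `pip install huggingface_hub`; repo id and branch copied from the links above.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="TheMelonGod/dolphin-2.9.4-llama3.1-8b-exl2",  # repo id as it appears in the links
    revision="4.5bpw",                                     # branch name = desired bits per weight
    local_dir="exl2-4.5bpw",                               # example local folder name
)
print(f"Quantization downloaded to: {local_path}")
```
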
+ If you need a specific model quantized, or a particular bits-per-weight setting, please let me know. I'm happy to help quantize lesser-known models.
+
+ If you have any suggestions for improvements or feedback, feel free to reach out. Your input is greatly appreciated and helps me make these quantizations better for everyone.
+
+ Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!
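
Once a branch has been downloaded (see the sketch earlier), the weights can be loaded with turboderp's `exllamav2` Python package. A minimal sketch, assuming the classic `ExLlamaV2BaseGenerator` API; the model directory, prompt, and sampling values are placeholders:

```python
# Sketch: load an ExLlamaV2 (exl2) quantization and generate text.
# Assumes: `pip install exllamav2`, a CUDA-capable GPU, and the directory produced by
# the download sketch above ("exl2-4.5bpw" is a placeholder).
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "exl2-4.5bpw"   # directory containing the downloaded quantization
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate the KV cache lazily during loading
model.load_autosplit(cache)               # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

# Generate up to 64 new tokens from a placeholder prompt.
print(generator.generate_simple("The quick brown fox", settings, 64))
```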