Update README.md
README.md
CHANGED
@@ -30,13 +30,12 @@ The model should handle 25-32k context window size.
 [ko-fi To buy sweets for my cat :3](https://ko-fi.com/icefog72)
 
 ## Merge Details
-> Quants
+> Exl2 Quants
 >- [4.2bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-4.2bpw-exl2)
 >- [6.5bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-6.5bpw-exl2)
 >- [8bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-8bpw-exl2)
-
->
->thx mradermacher for
+
+> thx mradermacher for GGUF
 >- [GGUF](https://huggingface.co/mradermacher/IceSakeRP-7b-GGUF)
 >- [i1-GGUF](https://huggingface.co/mradermacher/IceSakeRP-7b-i1-GGUF)
 
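For context, a minimal sketch of how one of the GGUF quants linked in this change could be loaded with llama-cpp-python. The Q4_K_M filename glob is an assumption about the files in the mradermacher repo (this commit only links the repo), and the 32768-token context value is taken from the "25-32k context window" note in the surrounding README text.

```python
# Hypothetical usage sketch, not part of this commit:
# download one of the linked GGUF quants and run a short chat completion.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/IceSakeRP-7b-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quant filename pattern inside the repo
    n_ctx=32768,              # README suggests the model handles 25-32k context
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```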