Upload folder using huggingface_hub
- .gitattributes +12 -0
- Orca-2-13b-SFT_v5-Q2_K.gguf +3 -0
- Orca-2-13b-SFT_v5-Q3_K_L.gguf +3 -0
- Orca-2-13b-SFT_v5-Q3_K_M.gguf +3 -0
- Orca-2-13b-SFT_v5-Q3_K_S.gguf +3 -0
- Orca-2-13b-SFT_v5-Q4_0.gguf +3 -0
- Orca-2-13b-SFT_v5-Q4_K_M.gguf +3 -0
- Orca-2-13b-SFT_v5-Q4_K_S.gguf +3 -0
- Orca-2-13b-SFT_v5-Q5_0.gguf +3 -0
- Orca-2-13b-SFT_v5-Q5_K_M.gguf +3 -0
- Orca-2-13b-SFT_v5-Q5_K_S.gguf +3 -0
- Orca-2-13b-SFT_v5-Q6_K.gguf +3 -0
- Orca-2-13b-SFT_v5-Q8_0.gguf +3 -0
- README.md +143 -0
.gitattributes
CHANGED
@@ -33,3 +33,15 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Orca-2-13b-SFT_v5-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
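The added lines above register each quantized file with Git LFS. For context, a minimal sketch of how equivalent entries are typically produced (the glob pattern is an illustrative assumption; the commit itself lists each file explicitly):

```shell
# Illustrative only: have Git store LFS pointer files instead of the multi-GB GGUF binaries.
git lfs install
git lfs track "*.gguf"    # appends "*.gguf filter=lfs diff=lfs merge=lfs -text" to .gitattributes
git add .gitattributes
```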
Orca-2-13b-SFT_v5-Q2_K.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0f5ee2858514bac7eee7dad4bdf1f43049f1dbff81645a5d53e0dc3deb0d2595
+size 4854288576
Orca-2-13b-SFT_v5-Q3_K_L.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1f6574be09db8e60bd7072908133f00fee22d89d70174dee5ab029e3039af80f
+size 6929579744
Orca-2-13b-SFT_v5-Q3_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab84a591dab8b7fe3b1acf1df109cef893cf3bac8fc3fcab3e1419b45c917535
+size 6337789664
Orca-2-13b-SFT_v5-Q3_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:24e80898bc1b6e8eb12a6791eed46fc984588d6171316cdfed00ac91aec88642
+size 5659000544
Orca-2-13b-SFT_v5-Q4_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:15bad406af4f751a4ea3155cd5e680dcec95829da0b132915581cf1369795081
+size 7365856960
Orca-2-13b-SFT_v5-Q4_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1011f52c2f08285ab4e443921e0bf12983734d321f2213b6b3fc5edb81020442
+size 7865978560
Orca-2-13b-SFT_v5-Q4_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fe0c44166083207c9a2e8733746c9733a3b497f8e3b2d228c53ce71feb0ba209
+size 7423200960
Orca-2-13b-SFT_v5-Q5_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c772a4834362a0d7a2aae7103e3040034b38ff46d1e908fb6db9f5f1852c6b82
+size 8972310080
Orca-2-13b-SFT_v5-Q5_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4d83d4d52fa977ca7576dca5dcb166b7e90d7f54aac93253dc767c73ff61eb20
+size 9229948480
Orca-2-13b-SFT_v5-Q5_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d058d91d25ec14f2a86fccee67c9b8235e45d2f453ab6f84251507f68300b336
+size 8972310080
Orca-2-13b-SFT_v5-Q6_K.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:23a731454ea917f54f9d3decc5742e854d43fda3e4af7b7283dc509a03ec6ec7
+size 10679166528
Orca-2-13b-SFT_v5-Q8_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:821b914290d5b00bb55595f158a4f643715b08d1f77f8be4b53486567b601bfe
+size 13831353152
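Each ADDED block above is a Git LFS pointer, not the model itself: `oid` is the SHA-256 of the real binary and `size` is its length in bytes. A minimal sketch for checking a downloaded file against its pointer (assumes GNU coreutils; the filename is just the first entry above):

```shell
# Verify that a downloaded GGUF matches the LFS pointer committed here.
sha256sum Orca-2-13b-SFT_v5-Q2_K.gguf   # expect 0f5ee2858514bac7eee7dad4bdf1f43049f1dbff81645a5d53e0dc3deb0d2595
stat -c %s Orca-2-13b-SFT_v5-Q2_K.gguf  # expect 4854288576 bytes
```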
README.md
ADDED
@@ -0,0 +1,143 @@
---
license: other
language:
- en
- fr
- es
- hi
- zh
- code
base_model: Locutusque/Orca-2-13b-SFT_v5
datasets:
- HuggingFaceH4/no_robots
- mlabonne/guanaco-llama2-1k
- OpenAssistant/oasst_top1_2023-08-25
- totally-not-an-llm/EverythingLM-data-V3
- garage-bAInd/Open-Platypus
widget:
- example_title: folklore
  text: '<|im_start|>user

    Are the stories of Nahanni Valley, like the decapitated men, true?

    <|im_end|>

    <|im_start|>assistant

    '
  output:
    text: 'The Nahanni Valley in Canada is a remote and rugged region that has long
      been shrouded in mystery. It is home to the South Nahanni River, which was designated
      as a UNESCO World Heritage Site due to its exceptional natural beauty and biodiversity.
      The valley''s history includes tales of gold prospectors, fur traders, and indigenous
      populations, but it also features some unexplained events and legends. One such
      story involves two prospectors who allegedly encountered an unknown creature
      that led them into the forest before returning without their heads. This tale
      has been passed down through generations, but there are no definitive answers
      regarding its veracity.


      It is important to note that these stories may have originated from local folklore
      or were exaggerated over time. There is no concrete evidence to support the
      existence of supernatural creatures or paranormal occurrences within the Nahanni
      Valley. Nonetheless, the area remains a captivating destination for those seeking
      adventure and exploration. If you have any questions about this topic or need
      assistance related to traveling to the Nahanni Valley, please feel free to ask!'
- example_title: Alan Turing
  text: '<|im_start|>user

    Why was Alan Turing castrated?

    <|im_end|>

    <|im_start|>assistant

    '
  output:
    text: 'Alan Turing, a renowned mathematician and computer scientist who played
      a crucial role in breaking the German Enigma code during World War II, faced
      significant challenges due to his homosexuality. In 1952, he was convicted of
      "gross indecency" for engaging in a consensual relationship with another man,
      which led to him being sentenced to hormone therapy designed to reduce his sex
      drive. This treatment included the administration of estrogen, which had the
      side effect of causing breast growth.


      In an effort to avoid this unwanted physical change, Turing requested that his
      doctors cease the hormone treatments. However, they ignored his request and
      continued the therapy without his consent. As a result, Turing''s breasts began
      to grow, leading him to undergo surgery to remove them. The operation, known
      as a mastectomy, caused considerable pain and discomfort, further contributing
      to his emotional turmoil.


      It is important to note that Turing''s castration was not directly carried out
      by the authorities but rather resulted from the unintended consequences of the
      forced hormone therapy.'
tags:
- TensorBlock
- GGUF
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>

## Locutusque/Orca-2-13b-SFT_v5 - GGUF

This repo contains GGUF format model files for [Locutusque/Orca-2-13b-SFT_v5](https://huggingface.co/Locutusque/Orca-2-13b-SFT_v5).

The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
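If you need a matching local build of llama.cpp, a minimal sketch (assuming Git, CMake, and a C++ toolchain are available; this is not part of the original card):

```shell
# Build llama.cpp at the commit referenced above (b4011).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
git checkout a6744e43e80f4be6398fc7733a01642c846dce1d
cmake -B build
cmake --build build --config Release
```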
## Prompt template

```

```

## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Orca-2-13b-SFT_v5-Q2_K.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q2_K.gguf) | Q2_K | 4.521 GB | smallest, significant quality loss - not recommended for most purposes |
| [Orca-2-13b-SFT_v5-Q3_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q3_K_S.gguf) | Q3_K_S | 5.270 GB | very small, high quality loss |
| [Orca-2-13b-SFT_v5-Q3_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q3_K_M.gguf) | Q3_K_M | 5.903 GB | very small, high quality loss |
| [Orca-2-13b-SFT_v5-Q3_K_L.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q3_K_L.gguf) | Q3_K_L | 6.454 GB | small, substantial quality loss |
| [Orca-2-13b-SFT_v5-Q4_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q4_0.gguf) | Q4_0 | 6.860 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Orca-2-13b-SFT_v5-Q4_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q4_K_S.gguf) | Q4_K_S | 6.913 GB | small, greater quality loss |
| [Orca-2-13b-SFT_v5-Q4_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q4_K_M.gguf) | Q4_K_M | 7.326 GB | medium, balanced quality - recommended |
| [Orca-2-13b-SFT_v5-Q5_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q5_0.gguf) | Q5_0 | 8.356 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Orca-2-13b-SFT_v5-Q5_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q5_K_S.gguf) | Q5_K_S | 8.356 GB | large, low quality loss - recommended |
| [Orca-2-13b-SFT_v5-Q5_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q5_K_M.gguf) | Q5_K_M | 8.596 GB | large, very low quality loss - recommended |
| [Orca-2-13b-SFT_v5-Q6_K.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q6_K.gguf) | Q6_K | 9.946 GB | very large, extremely low quality loss |
| [Orca-2-13b-SFT_v5-Q8_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT_v5-GGUF/tree/main/Orca-2-13b-SFT_v5-Q8_0.gguf) | Q8_0 | 12.881 GB | very large, extremely low quality loss - not recommended |
## Downloading instructions

### Command line
First, install the Hugging Face CLI:

```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:

```shell
huggingface-cli download tensorblock/Orca-2-13b-SFT_v5-GGUF --include "Orca-2-13b-SFT_v5-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:

```shell
huggingface-cli download tensorblock/Orca-2-13b-SFT_v5-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
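After downloading, a minimal sketch of running one of the files with llama.cpp (the `llama-cli` binary path, flags, and the ChatML-style tags are assumptions based on recent llama.cpp builds and the widget examples above, not part of the original card):

```shell
# Run a quantized file from the build directory; -n caps the number of generated tokens.
./build/bin/llama-cli \
  -m MY_LOCAL_DIR/Orca-2-13b-SFT_v5-Q4_K_M.gguf \
  -p $'<|im_start|>user\nWhy is the sky blue?<|im_end|>\n<|im_start|>assistant\n' \
  -n 256
```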