---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B/blob/main/LICENSE
language:
- en
---

## Information

This is an EXL2 quantized version of [Yi-34B](https://huggingface.co/01-ai/Yi-34B), made with [exllamav2](https://github.com/turboderp/exllamav2).

Please refer to the original creator for more information.
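
To run an EXL2 quant you need an exllamav2-based loader. Below is a minimal inference sketch modeled on exllamav2's bundled example scripts; the API changes between versions, so treat the class and method names as assumptions and check the library's own examples. The model path is a placeholder for a downloaded branch (see Branches below).

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point at a downloaded quant branch (placeholder path).
config = ExLlamaV2Config()
config.model_dir = "/path/to/Yi-34B-exl2-4.5bpw"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

# Yi-34B is a base model, so use a raw completion prompt.
print(generator.generate_simple("Once upon a time,", settings, 128))
```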
Calibration dataset: [wikitext](https://huggingface.co/datasets/wikitext/tree/refs%2Fconvert%2Fparquet/wikitext-2-v1/test)
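
If you want to inspect the calibration data yourself, the linked split can be loaded with the `datasets` library; this is just a convenience sketch and not part of the quantization workflow:

```python
from datasets import load_dataset

# Load the wikitext-2-v1 test split linked above.
cal = load_dataset("wikitext", "wikitext-2-v1", split="test")
print(len(cal), "rows")
print(cal[5]["text"][:120])
```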
## Branches

- main: Measurement files
- 4bpw: 4 bits per weight
- 4.5bpw: 4.5 bits per weight
- 4.6bpw: 4.6 bits per weight
- 6bpw: 6 bits per weight
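
Each quant lives on its own branch, so download the branch you want rather than `main`. A minimal sketch using `huggingface_hub` (the repo ID below is a placeholder for this repository's ID):

```python
from huggingface_hub import snapshot_download

# Download only the 4.5bpw quant by requesting its branch as the revision.
# Replace the repo_id placeholder with this repository's actual ID.
path = snapshot_download(repo_id="user/Yi-34B-exl2", revision="4.5bpw")
print("Downloaded to:", path)
```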
## Notes

- 4.5bpw and 4.6bpw were added for experimental purposes, to see how far a 24 GB card can be pushed (see the back-of-envelope sketch after this list).
- 6bpw is recommended for the best quality-to-VRAM ratio, assuming you have enough VRAM.
- Please ask in the community tab if you need other bpw values.
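
As a rough sanity check on those recommendations, weight memory scales linearly with bits per weight. The numbers below are a back-of-envelope estimate only; they ignore the KV cache, activations, and runtime overhead, which also take VRAM:

```python
# Approximate weight footprint for a ~34-billion-parameter model.
PARAMS = 34e9  # rough parameter count for Yi-34B

for bpw in (4.0, 4.5, 4.6, 6.0):
    gib = PARAMS * bpw / 8 / 1024**3  # bits -> bytes -> GiB
    print(f"{bpw} bpw ~ {gib:.1f} GiB of weights")

# Prints roughly: 4.0 -> 15.8, 4.5 -> 17.8, 4.6 -> 18.2, 6.0 -> 23.7 GiB,
# which is why 4.5-4.6bpw sits near the edge of a 24 GB card once the
# KV cache is added, and 6bpw generally wants more than 24 GB total.
```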
## Donate?

All my infrastructure and cloud expenses are paid out of pocket. If you'd like to donate, you can do so here: https://ko-fi.com/kingbri

You should not feel obligated to donate, but if you do, I'd appreciate it.

---