mradermacher committed
Commit 5b19235 · verified · 1 Parent(s): d9366e4

auto-patch README.md

Files changed (1)
  1. README.md +1 -2
README.md CHANGED

@@ -5,7 +5,7 @@ language:
 - zh
 library_name: transformers
 license: apache-2.0
-no_imatrix: "nan detected in blk.31.attn_q.weight"
+no_imatrix: nan detected in blk.31.attn_q.weight
 quantized_by: mradermacher
 tags:
 - Long Context
@@ -21,7 +21,6 @@ tags:
 static quants of https://huggingface.co/THUDM/LongAlign-7B-64k-base

 <!-- provided-files -->
-weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
 ## Usage

 If you are unsure how to use GGUF files, refer to one of [TheBloke's