Update README.md
README.md
CHANGED
@@ -47,7 +47,7 @@ Reminder: ExLlama does not support 3-bit models, so if you wish to try those qua
 
 ## AutoGPTQ and GPTQ-for-LLaMa requires latest version of Transformers
 
-If you plan to use any of these quants with AutoGPTQ or GPTQ-for-LLaMa,
+If you plan to use any of these quants with AutoGPTQ or GPTQ-for-LLaMa, your Transformers needs to be using the latest GitHub code.
 
 If you're using text-generation-webui and have updated to the latest version, this is done for you automatically.
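For readers doing this manually rather than through text-generation-webui, a typical way to get Transformers onto the latest GitHub code is a pip install directly from the repository. This is a sketch assuming a standard pip-based Python environment; the exact steps may differ for your setup:

```shell
# Install Transformers from the latest main branch on GitHub,
# replacing any release version already in the environment
pip install --upgrade git+https://github.com/huggingface/transformers
```

After installing, you can confirm which version is active with `pip show transformers`.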