Update README.md
README.md
CHANGED
@@ -39,14 +39,8 @@ This model has been 4-bit quantized Llada-8B-Base model with [GPTQModel](https:/
 - **damp_percent**: 0.1
 - **damp_auto_increment**: 0.0015
 
-
-
-
-| Dataset       | GPTQ-4bit | FP16    |
-|---------------|-----------|---------|
-| mmlu          | TODO      | 65.9(5) |
-| cmmlu         | TODO      | 69.9(5) |
-| arc_challenge | 45.48     | 47.9(0) |
+
+
 
 ## Example:
 ```python
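
For context, `damp_percent` and `damp_auto_increment` in the settings above are GPTQModel quantization parameters. Below is a minimal sketch of how such values might be passed to GPTQModel's `QuantizeConfig`; the repo id, group size, and calibration data are assumptions for illustration, not the exact recipe used to produce this checkpoint.

```python
# Minimal sketch, assuming the gptqmodel package and the GSAI-ML/LLaDA-8B-Base
# repo id; the damp values mirror the settings listed in this card.
from gptqmodel import GPTQModel, QuantizeConfig

quant_config = QuantizeConfig(
    bits=4,                      # 4-bit quantization, as in this card
    group_size=128,              # assumed group size; check quantize_config.json
    damp_percent=0.1,            # damp_percent from the settings above
    damp_auto_increment=0.0015,  # damp_auto_increment from the settings above
)

# Load the FP16 base model for quantization (repo id assumed).
model = GPTQModel.load(
    "GSAI-ML/LLaDA-8B-Base",
    quant_config,
    trust_remote_code=True,  # LLaDA ships custom modeling code
)

# Placeholder calibration text; a real run should use a proper calibration set
# (typically a few hundred samples).
calibration = [
    "The quick brown fox jumps over the lazy dog.",
    "Large language diffusion models predict masked tokens iteratively.",
]

model.quantize(calibration)
model.save("LLaDA-8B-Base-gptqmodel-4bit")
```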