---
model-index:
- name: gpt2-xl
  results:
  - task:
      type: text-generation
    dataset:
      name: Wikitext
      type: wikitext
    metrics:
    - type: perplexity (BASELINE)
      value: 20.37662382338481
    - type: perplexity (BASIC)
      value: 20.394323121837566
  - task:
      type: text-generation
    dataset:
      name: Hellaswag
      type: hellaswag
    metrics:
    - type: accuracy (BASELINE)
      value: 0.4004182433778132
    - type: accuracy (BASIC)
      value: 0.4004182433778132
---
This is a d-Matrix functional reference of the GPT2-XL model.
The reference provides the following functional *configurations*:
Configuration | Explanation
:-- | :--
**`BASELINE`** | a reference functionally equivalent to the original model
**`BASIC`** | all linear algebraic operands quantized to `MXINT8-64`, and all other operations transformed to approximated kernel simulations
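
As a rough sketch of how one of these configurations might be selected after transforming a model (see the Usage section below), assuming Dmx_Compressor exposes a `config_rules` module with rule sets named after the configurations; verify the exact entry point against the Dmx_Compressor documentation:

```python
from dmx.compressor import config_rules  # assumed module layout
from dmx.compressor.modeling import DmxModel
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("d-matrix/gpt2-xl", trust_remote_code=True)
model = DmxModel.from_torch(model)

# Hypothetical: apply the BASIC quantization/approximation rules to the
# transformed model; `transform` and `config_rules.BASIC` are assumptions here.
model.transform(model.dmx_config, *config_rules.BASIC)
```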


### Usage

Install the d-Matrix [Dmx_Compressor](https://github.com/d-matrix-ai/dmx-compressor) package first:
```sh
pip install dmx_compressor
```

The following example evaluates the model with EleutherAI's [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness); install it first:

```sh
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```
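
Then load the model through the harness, transform it with `DmxModel`, and run the evaluation: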

```python
from dmx.compressor.modeling import DmxModel
import lm_eval

model_args = "pretrained=d-matrix/gpt2-xl,trust_remote_code=True"

# Load the model through the harness's Hugging Face model wrapper
lm = lm_eval.api.registry.get_model("hf").create_from_arg_string(model_args, {"batch_size": 1})

# Transform the model with DMX
lm._model = DmxModel.from_torch(lm._model)

eval_results = lm_eval.evaluate(lm, lm_eval.tasks.get_task_dict(["wikitext"]))  # Choose the desired task(s), e.g. "wikitext"
```
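
The returned `eval_results` is a plain dictionary; a minimal way to inspect the scores, assuming the harness's standard result layout with a top-level `"results"` key mapping task names to metric dictionaries:

```python
# Print per-task metrics, e.g. wikitext word perplexity
for task, metrics in eval_results["results"].items():
    print(task, metrics)
```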