---
language:
- en
tags:
- pytorch
- causal-lm
license: apache-2.0
datasets:
- EleutherAI/pile
---
This is a d-Matrix functional reference of the [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6b) model.
The reference provides the following functional *configurations*:

Configuration | Explanation
:-- | :--
**`BASELINE`** | a reference functionally equivalent to the original model
**`BASIC`** | all linear algebraic operands quantized to `BFP16-64`, and all other operations transformed to approximated kernel simulations
### Usage
Install d-Matrix [ML Tools](https://github.com/d-matrix-ai/dmx-mltools) first.
```sh
pip install dmx-mltools
```
The following is an example of instantiating the model and running an evaluation.
```python
from mltools.dmx import pipeline

pipe = pipeline(
    task="text-generation",
    model="d-matrix/gpt-j-6b",
    dmx_config="BASELINE",  # see above for other variants
    trust_remote_code=True,
    # device_map="auto",  # enable model parallelism on multi-GPU nodes
)

results = pipe.evaluate(
    metric=["d-matrix/perplexity", "d-matrix/accuracy"],
    dataset="LAMBADA",
)
```
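
Because the `dmx_config` argument selects between the configurations listed above, both variants can be evaluated with the same settings and compared side by side. The snippet below is a minimal sketch of such a comparison; it only reuses the `pipeline()` and `evaluate()` calls shown above, and the loop and the `results_by_config` dictionary are illustrative additions rather than part of the d-Matrix API.

```python
from mltools.dmx import pipeline

# Sketch: evaluate each functional configuration with identical settings,
# reusing the pipeline()/evaluate() interfaces from the example above.
results_by_config = {}
for config in ("BASELINE", "BASIC"):
    pipe = pipeline(
        task="text-generation",
        model="d-matrix/gpt-j-6b",
        dmx_config=config,
        trust_remote_code=True,
    )
    results_by_config[config] = pipe.evaluate(
        metric=["d-matrix/perplexity", "d-matrix/accuracy"],
        dataset="LAMBADA",
    )

# Inspect the metrics reported for each configuration.
print(results_by_config)
```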