pooja-ganesh committed on
Commit e3904ca · verified · 1 Parent(s): 6c942d1

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -24,9 +24,9 @@ tags:
  - ## Introduction
  This model was created using Quark Quantization, followed by OGA Model Builder, and finalized with post-processing for NPU deployment.
  - ## Quantization Strategy
- - AWQ / Group 128 / Asymmetric / BF16 activations
+ - AWQ / Group 128 / Asymmetric / BF16 activations / UINT4 Weights
  - ## Quick Start
- For quickstart, refer to AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
+ For quickstart, refer to npu-llm-artifacts_1.3.0.zip available in [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
 
  #### Evaluation scores
  The perplexity measurement is run on the wikitext-2-raw-v1 (raw data) dataset provided by Hugging Face. Perplexity score measured for prompt length 2k is 6.861743.
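The updated quantization line names an "AWQ / Group 128 / Asymmetric / UINT4 Weights" recipe. As a rough intuition for what group-wise asymmetric UINT4 quantization means, here is a minimal sketch; it is a hypothetical illustration with invented function names, not AMD Quark's actual implementation (which also applies AWQ's activation-aware scaling before rounding).

```python
# Hypothetical sketch of group-wise asymmetric UINT4 weight quantization
# (group size 128). NOT AMD Quark's actual implementation; names invented.

GROUP_SIZE = 128
QMIN, QMAX = 0, 15  # UINT4 value range

def quantize_group(weights):
    """Asymmetrically map one group of float weights onto [0, 15]."""
    w_min, w_max = min(weights), max(weights)
    scale = (w_max - w_min) / (QMAX - QMIN) or 1.0  # guard constant groups
    zero_point = round(-w_min / scale)  # integer offset so w_min maps near QMIN
    q = [max(QMIN, min(QMAX, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize_group(q, scale, zero_point):
    """Recover approximate float weights from UINT4 codes."""
    return [(qi - zero_point) * scale for qi in q]

if __name__ == "__main__":
    import random
    random.seed(0)
    group = [random.uniform(-0.5, 0.5) for _ in range(GROUP_SIZE)]
    codes, scale, zp = quantize_group(group)
    recon = dequantize_group(codes, scale, zp)
    max_err = max(abs(a - b) for a, b in zip(group, recon))
    # round-trip error stays within one quantization step
    assert max_err <= scale
    print(f"scale={scale:.5f} zero_point={zp} max round-trip error={max_err:.5f}")
```

Each group of 128 weights keeps its own `scale` and `zero_point`; the asymmetric zero point is what lets the full UINT4 range cover weight distributions that are not centered on zero.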
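The evaluation note reports perplexity on wikitext-2-raw-v1. As a reminder of what that score measures, here is a minimal sketch of the standard definition, perplexity as the exponential of the mean per-token negative log-likelihood; this is not the actual evaluation harness used for the model.

```python
# Hypothetical sketch of a perplexity calculation like the wikitext-2
# evaluation quoted above. Not the actual harness used for this model.
import math

def perplexity(token_log_probs):
    """Perplexity from per-token natural-log probabilities."""
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# Sanity check: a model assigning probability 1/8 to every token of a
# 2048-token window has perplexity 8.
log_probs = [math.log(1 / 8)] * 2048
assert abs(perplexity(log_probs) - 8.0) < 1e-9
```

Lower is better: a score of 6.86 means the model is, on average, about as uncertain as if it were choosing uniformly among roughly 7 tokens at each step.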