Lin-K76 committed
Commit 71fb5c0 · verified · 1 Parent(s): 6e463b1

Update README.md

Files changed (1): README.md (+6 -6)
README.md CHANGED
@@ -24,8 +24,8 @@ language:
  - **License(s):** [llama3.1](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B/blob/main/LICENSE)
  - **Model Developers:** Neural Magic
 
- Quantized version of [Meta-Llama-3.1-405B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct). It achieves an average recovery of 99.97% on the [OpenLLM](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard) benchmark (version 1), compared to the unquantized model.
- <!-- It achieves an average score of 78.69 on the [OpenLLM](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard) benchmark (version 1), whereas the unquantized model achieves 78.67. -->
+ Quantized version of [Meta-Llama-3.1-405B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct).
+ It achieves an average score of 96.67 on the [OpenLLM](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard) benchmark (version 1), whereas the unquantized model achieves 96.93.
 
  ### Model Optimizations
 
@@ -163,9 +163,9 @@ This version of the lm-evaluation-harness includes versions of ARC-Challenge and
  </td>
  <td>96.93
  </td>
- <td>*being collected
+ <td>96.67
  </td>
- <td>*
+ <td>99.73%
  </td>
  </tr>
  <tr>
@@ -213,9 +213,9 @@ This version of the lm-evaluation-harness includes versions of ARC-Challenge and
  </td>
  <td><strong>86.63</strong>
  </td>
- <td><strong>*</strong>
+ <td><strong>86.55</strong>
  </td>
- <td><strong>99.97%</strong>
+ <td><strong>99.91%</strong>
  </td>
  </tr>
  </table>
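For reference, the recovery percentages added in this commit appear to be the quantized score expressed as a fraction of the unquantized score (96.67 / 96.93 ≈ 99.73%, 86.55 / 86.63 ≈ 99.91%); that definition is an assumption here, since the diff itself only lists the resulting values. A minimal sketch of that arithmetic:

```python
# Minimal sketch: recovery computed as quantized score / unquantized score.
# This definition is assumed; the diff only shows the resulting percentages.

def recovery(quantized: float, unquantized: float) -> float:
    """Return the quantized model's score as a percentage of the unquantized score."""
    return 100.0 * quantized / unquantized

# Values taken from the table cells updated in this commit.
print(f"{recovery(96.67, 96.93):.2f}%")  # 99.73%
print(f"{recovery(86.55, 86.63):.2f}%")  # 99.91%
```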