Update README.md
README.md CHANGED
@@ -38,6 +38,6 @@ The model got the **TMMLU+** (0 shot) performance using [EleutherAI/lm-evaluatio
 |Details on TMMLU+ (0 shot):<br/>Model | Base Model | STEM | Social Science | Humanities | Other | AVG |
 |-----------------------------------------------------|:---------------------:|:---------------:|:--------------:|:----------:|:----------:|:-------:|
-| Taiwan-inquiry_7B_v2.1 |Breeze-7B-Instruct-v1_0| 36.
+| Taiwan-inquiry_7B_v2.1 |Breeze-7B-Instruct-v1_0| 36.06 | 44.61 | 37.49 | 39.61 | 40.29 |
 | Taiwan-inquiry_7B_v2.0 |Breeze-7B-Instruct-v0_1| 36.17 | 43.59 | 35.45 | 37.63 | 38.95 |
 | Taiwan-inquiry_7B_v1.0 |Taiwan-LLM-7B-v2.1-chat| 26.74 | 29.47 | 26.83 | 29.61 | 28.83 |