Update README.md
README.md — CHANGED

@@ -35,15 +35,15 @@ model = AutoModel.from_pretrained('tunib/electra-ko-en-base')
 |***TUNiB-Electra-ko-en-base*** | 133M |84.74 |90.15 | 86.93 | 83.05 | 79.70 | 82.23 | 95.64 | 83.61 / 92.37 |67.86 |
 | [KoELECTRA-base-v3](https://github.com/monologg/KoELECTRA) | 110M | 85.92 |90.63 | **88.11** | 84.45 | 82.24 | **85.53** | 95.25 | **84.83 / 93.45** | 67.61 |
 | [KcELECTRA-base](https://github.com/Beomi/KcELECTRA) | 124M| 84.75 |**91.71** | 86.90 | 74.80 | 81.65 | 82.65 | **95.78** | 70.60 / 90.11 | **74.49** |
-| [KoBERT-base](https://github.com/SKTBrain/KoBERT) | 90M |
-| [KcBERT-base](https://github.com/Beomi/KcBERT) | 110M |
-| [XLM-Roberta-base](https://github.com/pytorch/fairseq/tree/master/examples/xlmr) | 280M |
+| [KoBERT-base](https://github.com/SKTBrain/KoBERT) | 90M | 81.92 | 89.63 | 86.11 | 80.65 | 79.00 | 79.64 | 93.93 | 52.81 / 80.27 | 66.21 |
+| [KcBERT-base](https://github.com/Beomi/KcBERT) | 110M | 79.79 | 89.62 | 84.34 | 66.95 | 74.85 | 75.57 | 93.93 | 60.25 / 84.39 | 68.77 |
+| [XLM-Roberta-base](https://github.com/pytorch/fairseq/tree/master/examples/xlmr) | 280M | 83.03 |89.49 | 86.26 | 82.95 | 79.92 | 79.09 | 93.53 | 64.70 / 88.94 | 64.06 |
+
 
 
 
 ## Results on English downstream tasks
 
-
 
 | |**# Params** | **Avg.** |**CoLA**<br/>(MCC) | **SST**<br/>(Acc) |MRPC<br/>(Acc)| **STS**<br/>(Spearman) | **QQP**<br/>(Acc) | **MNLI**<br/>(Acc) | **QNLI**<br/>(Acc) | **RTE**<br/>(Acc) |
 | :----------------:| :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: | :---------------------------: | :---------------------------: | :---------------------------: |
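The **Avg.** column in the restored rows appears to be the unweighted mean of the eight task scores, counting only the F1 half of the `EM / F1` question-answering pair — this is an inference from the numbers themselves (it reproduces the reported averages to within ±0.02, consistent with the inputs being rounded), not something the diff states. A minimal sketch checking that reading against two rows from the table:

```python
# Recompute the Avg. column as the plain mean of the eight task scores,
# using only the F1 half of the "EM / F1" pair. Rows copied from the table.
rows = {
    # model name: (reported average, task scores with F1 substituted for EM / F1)
    "TUNiB-Electra-ko-en-base": (84.74, [90.15, 86.93, 83.05, 79.70, 82.23, 95.64, 92.37, 67.86]),
    "KoBERT-base": (81.92, [89.63, 86.11, 80.65, 79.00, 79.64, 93.93, 80.27, 66.21]),
}

for name, (reported, scores) in rows.items():
    recomputed = sum(scores) / len(scores)
    # Matches the reported value to within +/-0.02 (inputs are rounded to 2 dp).
    print(f"{name}: reported {reported}, recomputed {recomputed:.2f}")
```

If the averages were instead taken over both EM and F1 (nine values) or over EM alone, the recomputed means drift by 1–3 points, so the F1-only reading is the one consistent with the table.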