license: "apache-2.0"
---

# This model is trained on 180G of data; we recommend using it over the original version.

## Chinese ELECTRA
Google and Stanford University released a new pre-trained model called ELECTRA, which has a much more compact model size and competitive performance compared to BERT and its variants.
To further accelerate research on Chinese pre-trained models, the Joint Laboratory of HIT and iFLYTEK Research (HFL) has released Chinese ELECTRA models based on the official ELECTRA code.
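As a minimal sketch of how a checkpoint from this family can be loaded with the Hugging Face `transformers` library — the repository id below is an assumption chosen for illustration and may differ from the checkpoint this card describes:

```python
# Minimal sketch: loading a Chinese ELECTRA checkpoint with Hugging Face
# transformers. The repo id is an assumption for illustration; substitute
# the id of the checkpoint this model card actually describes.
from transformers import AutoTokenizer, AutoModel

model_name = "hfl/chinese-electra-180g-base-discriminator"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short Chinese sentence and run it through the discriminator.
inputs = tokenizer("你好,世界", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```

`AutoModel` resolves to the ELECTRA discriminator backbone here; for downstream tasks you would typically wrap it with a task head (e.g. `AutoModelForSequenceClassification`) and fine-tune as with BERT.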