Fill-Mask
Transformers
PyTorch
Chinese
bert
Inference Endpoints
minghui727 committed on
Commit e9e3272 · 1 Parent(s): cc8f917

Update README.md

Files changed (1): README.md +5 -7
README.md CHANGED
@@ -9,12 +9,9 @@ tags:
  license: apache-2.0
  ---
  ## Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain
- For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model)** for the medical domain named **pai-dkplm-bert-zh**.
+ For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model)** for the medical domain, named **pai-dkplm-bert-zh**, from our AAAI 2021 paper **DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding**.
 
- **[DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding](https://arxiv.org/abs/2112.01047)**
- Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang
-
- This repository is developed based on the EasyNLP framework: [https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP ) developed by the Alibaba PAI team.
+ This repository is built on the EasyNLP framework ([https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP)) developed by the Alibaba PAI team. Please find the DKPLM tutorial here: [DKPLM Tutorial](https://github.com/alibaba/EasyNLP/tree/master/examples/dkplm_pretraining).
 
  ## Citation
  If you find this resource useful, please cite the following papers in your work.
@@ -22,7 +19,8 @@ If you find this resource useful, please cite the following papers in your work.
  - For the EasyNLP framework:
  ```
  @article{easynlp,
- title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing}, publisher = {arXiv},
+ title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
+ publisher = {arXiv},
  author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  url = {https://arxiv.org/abs/2205.00258},
  year = {2022}
@@ -34,7 +32,7 @@ title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
  author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
  url = {https://arxiv.org/abs/2112.01047},
- publisher = {arXiv},
+ publisher = {AAAI},
  year = {2021}
  }
  ```
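
Since the card tags this checkpoint for the fill-mask task with Transformers and PyTorch, a minimal usage sketch follows. It is not part of the original card: the repository ID `alibaba-pai/pai-dkplm-bert-zh` is a hypothetical placeholder inferred from the model name above, and the mask token is assumed to be BERT's standard `[MASK]`.

```python
# Minimal fill-mask sketch for this checkpoint; assumptions are noted inline.
from transformers import pipeline

# Hypothetical repository ID inferred from the model name "pai-dkplm-bert-zh";
# replace it with the actual ID of this model on the Hugging Face Hub.
fill_mask = pipeline("fill-mask", model="alibaba-pai/pai-dkplm-bert-zh")

# A Chinese medical-domain sentence: "A cold usually causes [MASK] ache."
# "[MASK]" is the standard BERT mask token, assumed to apply here.
for prediction in fill_mask("感冒通常会引起[MASK]痛。"):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Each prediction is a candidate token for the masked position together with its score, so a medical-domain model would be expected to rank clinically plausible tokens highly.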