Commit cc8f917 · kirito committed · 2 parent(s): ef941f9 58c8ef8

update READEME

Files changed (1): README.md (+18 −11)

README.md CHANGED
````diff
@@ -1,13 +1,21 @@
 ---
+language: zh
+pipeline_tag: fill-mask
+widget:
+- text: "感冒需要吃[MASK]"
+- text: "人类的[MASK]温是37度"
+tags:
+- bert
 license: apache-2.0
 ---
 ## Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain
-For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain named pai-dkplm-bert-zh.**
+For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model)** for the medical domain named **pai-dkplm-bert-zh**.
 
 **[DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding](https://arxiv.org/abs/2112.01047)**
 Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang
 
-This repository is developed based on the EasyNLP framework: [https://github.com/alibaba/EasyNLP ](https://github.com/alibaba/EasyNLP )developed by the Alibaba PAI team.
+This repository is developed based on the EasyNLP framework: [https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP) developed by the Alibaba PAI team.
+
 ## Citation
 If you find the resource is useful, please cite the following papers in your work.
 
@@ -15,19 +23,18 @@ If you find the resource is useful, please cite the following papers in your wor
 ```
 @article{easynlp,
 title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing}, publisher = {arXiv},
-author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
-url = {https://arxiv.org/abs/2205.00258},
-year = {2022}
+author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
+url = {https://arxiv.org/abs/2205.00258},
+year = {2022}
 }
 ```
-
 - For DKPLM:
 ```
 @article{dkplm,
-title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
-author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
-url = {https://arxiv.org/abs/2112.01047},
-publisher = {arXiv},
-year = {2021}
+title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
+author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
+url = {https://arxiv.org/abs/2112.01047},
+publisher = {arXiv},
+year = {2021}
 }
 ```
````
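Since the updated front matter tags the model as `fill-mask`, the widget prompts above can be reproduced locally with the Transformers pipeline. A minimal sketch, assuming the checkpoint is a standard BERT masked-LM export (as the card's tags suggest) and that it is published under the repo id `pai-dkplm-bert-zh`; the exact namespace is an assumption, so use the id shown on the model page:

```python
from transformers import pipeline

# Repo id is an assumption -- substitute the id shown on the model page.
fill_mask = pipeline("fill-mask", model="pai-dkplm-bert-zh")

# First widget prompt from the card: "For a cold you need to take [MASK]".
for pred in fill_mask("感冒需要吃[MASK]"):
    # Each prediction is a dict with the filled-in token and its score.
    print(pred["token_str"], round(pred["score"], 4))
```

The second widget prompt, 「人类的[MASK]温是37度」 ("the human [MASK] temperature is 37 degrees"), runs the same way; a medical-domain model would be expected to rank 体 (body, as in 体温) highly.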