Hanlard committed
Commit 84f1acd
1 Parent(s): 29ec7e2

Update README.md

Files changed (1)
  1. README.md +0 -9
README.md CHANGED
@@ -3,15 +3,6 @@
  PanGu-α is proposed by a joint technical team headed by PCNL. It is the first large-scale Chinese pre-trained language model with 200 billion parameters, trained on 2048 Ascend processors using an automatic hybrid parallel training strategy. The whole training process was done on the "Peng Cheng Cloud Brain II" computing platform with the domestic deep learning framework MindSpore. The PengCheng·PanGu-α pre-trained model supports rich applications, has strong few-shot learning capabilities, and performs outstandingly in text generation tasks such as knowledge question answering, knowledge retrieval, knowledge reasoning, and reading comprehension.
 
  [[Technical report](https://git.openi.org.cn/PCL-Platform.Intelligence/PanGu-Alpha/src/branch/master/PANGU-%ce%b1.pdf)]
- [[Model download](#model-download)]
- [[Model compression](#model-compression)]
- [[Model application](#model-application)]
- [[GPU inference and finetune](#gpu-inference-and-finetune)]
- [[Corpus collection and processing](https://git.openi.org.cn/PCL-Platform.Intelligence/DataCollector/src/branch/master/README-en.md)]
- [[MindSpore official website](https://mindspore.cn/)]
- [[Join WeChat communication group](#wechat-group)]
- [[License](#license)]
-
 
 
  ### Key points
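
For context on how a checkpoint from this Hub repository might be used for the few-shot text generation the README describes, here is a minimal, hypothetical sketch with Hugging Face `transformers`. PanGu-α is officially a MindSpore model, so whether this repository ships weights loadable through `transformers` (and whether `trust_remote_code=True` is required) is an assumption; the repository id and the Chinese prompt below are placeholders for illustration only.

```python
# Hypothetical sketch: few-shot generation with a PanGu-α checkpoint hosted on
# the Hugging Face Hub. The repo id is a placeholder, and loading via
# transformers with trust_remote_code=True is an assumption about this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<this-repo-id>"  # placeholder: replace with the actual repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# A few-shot style prompt for knowledge question answering (illustrative only).
prompt = "问:中国的首都是哪座城市?答:北京。\n问:珠穆朗玛峰位于哪条山脉?答:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```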
 