Update README.md

README.md
@@ -8,6 +8,12 @@ Apply for task sentiment analysis on using [AIViVN's comments dataset](https://w
 The model achieved 0.90268 on the public leaderboard (the winner's score is 0.90087).
 Bert4news is used in [ViNLP](https://github.com/bino282/ViNLP), a Vietnamese toolkit for word segmentation and Named Entity Recognition.
 
+We use word SentencePiece with basic BERT tokenization and the same configuration as BERT-base, with lowercase = False.
+
+You can download the trained model:
+- [tensorflow](https://drive.google.com/file/d/1X-sRDYf7moS_h61J3L79NkMVGHP-P-k5/view?usp=sharing)
+- [pytorch](https://drive.google.com/file/d/11aFSTpYIurn-oI2XpAmcCTccB_AonMOu/view?usp=sharing)
+
 *************** New Mar 11, 2020 ***************
 
 **[BERT](https://github.com/google-research/bert)** (from Google Research and the Toyota Technological Institute at Chicago) released with the paper [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805).
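The lines added above describe a cased setup (lowercase = False) and link to a downloadable PyTorch checkpoint. Below is a minimal sketch of loading that checkpoint with huggingface/transformers; the local directory name and archive layout (config.json, vocab.txt, pytorch_model.bin) are assumptions, not taken from this README.

``` python
# Sketch: load the downloaded PyTorch checkpoint with transformers.
# "./bert4news-pytorch" is a hypothetical path to the extracted archive.
from transformers import BertTokenizer, BertModel

MODEL_DIR = "./bert4news-pytorch"

# The README specifies lowercase = False, so keep the tokenizer cased.
tokenizer = BertTokenizer.from_pretrained(MODEL_DIR, do_lower_case=False)
model = BertModel.from_pretrained(MODEL_DIR)

inputs = tokenizer("Hà Nội là thủ đô của Việt Nam .", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768) for a BERT-base config
```

Keeping do_lower_case=False matters here: lowercasing the input at inference time would tokenize the text differently from pre-training, since the vocabulary is cased.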
@@ -73,13 +79,6 @@ print(entities)
 ```
 
 
-
-We use word SentencePiece with basic BERT tokenization and the same configuration as BERT-base, with lowercase = False.
-
-You can download the trained model:
-- [tensorflow](https://drive.google.com/file/d/1X-sRDYf7moS_h61J3L79NkMVGHP-P-k5/view?usp=sharing)
-- [pytorch](https://drive.google.com/file/d/11aFSTpYIurn-oI2XpAmcCTccB_AonMOu/view?usp=sharing)
-
 Use with huggingface/transformers
 ``` python
 import torch
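The "Use with huggingface/transformers" snippet above is cut off after `import torch`. A hedged completion is sketched below; the hub identifier is a placeholder, so substitute the model name given in the full README.

``` python
# Sketch: use the model through huggingface/transformers.
# "your-org/bert4news" is a placeholder hub id, not taken from this excerpt.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "your-org/bert4news"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

inputs = tokenizer("Xin chào Việt Nam .", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Sentence embedding from the [CLS] position, one common downstream use.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768]) for a BERT-base config
```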