---
tags:
- flair
- token-classification
- sequence-tagger-model
language: fa
datasets:
- NSURL-2019
widget:
- text: >-
    کارنامه نشر، وابسته به موسسه خانه کتاب و زیر نظر احمد مسجدی جامعی معاون امور
    فرهنگی وزارت فرهنگ و ارشاد اسلامی است.
metrics:
- f1
---
## Persian NER Using Flair
This is a 7-class named entity recognition model for Persian built with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **90.33** (NSURL-2019)
Predicts NER tags:

| **tag** | **meaning** |
|:---------------------------------:|:-----------:|
| PER | person name |
| LOC | location name |
| ORG | organization name |
| DAT | date |
| TIM | time |
| PCT | percent |
| MON | money |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and ParsBERT.
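As a rough illustration of that architecture, the sketch below stacks ParsBERT transformer embeddings with Persian Flair embeddings and trains a CRF sequence tagger. The data folder, column format, embedding identifiers (`HooshvareLab/bert-base-parsbert-uncased`, `fa-forward`/`fa-backward`) and hyperparameters are illustrative assumptions, not the exact recipe used for this model.

```python
from flair.datasets import ColumnCorpus
from flair.embeddings import FlairEmbeddings, StackedEmbeddings, TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# assumption: NSURL-2019 NER data exported as two-column (token, tag) CoNLL files
columns = {0: "text", 1: "ner"}
corpus = ColumnCorpus("data/nsurl2019", columns,
                      train_file="train.txt", dev_file="dev.txt", test_file="test.txt")

# build the NER tag dictionary from the corpus
tag_dictionary = corpus.make_label_dictionary(label_type="ner")

# stack ParsBERT with Persian Flair embeddings (assumed combination)
embeddings = StackedEmbeddings([
    TransformerWordEmbeddings("HooshvareLab/bert-base-parsbert-uncased"),
    FlairEmbeddings("fa-forward"),
    FlairEmbeddings("fa-backward"),
])

# BiLSTM-CRF sequence tagger on top of the stacked embeddings
tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type="ner",
                        use_crf=True)

# illustrative training hyperparameters
trainer = ModelTrainer(tagger, corpus)
trainer.train("resources/taggers/persian-ner",
              learning_rate=0.1,
              mini_batch_size=32,
              max_epochs=100)
```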
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("PooryaPiroozfar/Flair_Persian_NER")

# make example sentence
sentence = Sentence("کارنامه نشر، وابسته به موسسه خانه کتاب و زیر نظر احمد مسجدی جامعی معاون امور فرهنگی وزارت فرهنگ و ارشاد اسلامی است.")

# run NER over the sentence
tagger.predict(sentence)

# print the tagged sentence
print(sentence.to_tagged_string())
```
This yields the following output:
```
```
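Instead of the tagged string, you can also read the predicted entity spans directly; this is standard Flair usage (the label access shown assumes a reasonably recent Flair release):

```python
# iterate over recognized entity spans and print text, tag and confidence
for entity in sentence.get_spans("ner"):
    label = entity.get_label("ner")
    print(entity.text, label.value, round(label.score, 2))
```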
---
### Results
- F-score (micro) 0.9033
- F-score (macro) 0.8976
- Accuracy 0.8277
```
By class:
              precision    recall  f1-score   support

         ORG     0.9016    0.8667    0.8838      1523
         LOC     0.9113    0.9305    0.9208      1425
         PER     0.9216    0.9322    0.9269      1224
         DAT     0.8623    0.7958    0.8277       480
         MON     0.9665    0.9558    0.9611       181
         PCT     0.9375    0.9740    0.9554        77
         TIM     0.8235    0.7925    0.8077        53

   micro avg     0.9081    0.8984    0.9033      4963
   macro avg     0.9035    0.8925    0.8976      4963
weighted avg     0.9076    0.8984    0.9028      4963
 samples avg     0.8277    0.8277    0.8277      4963
```
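The per-class report above has the shape of Flair's standard evaluation output. A comparable report could be reproduced on a held-out test split roughly as follows; the data folder and file names are hypothetical placeholders for an NSURL-2019 export in two-column CoNLL format:

```python
from flair.datasets import ColumnCorpus
from flair.models import SequenceTagger

# assumption: NSURL-2019 data as two-column (token, tag) CoNLL files at hypothetical paths
corpus = ColumnCorpus("data/nsurl2019", {0: "text", 1: "ner"},
                      train_file="train.txt", dev_file="dev.txt", test_file="test.txt")

tagger = SequenceTagger.load("PooryaPiroozfar/Flair_Persian_NER")

# evaluate on the held-out test split; detailed_results holds the per-class report
result = tagger.evaluate(corpus.test, gold_label_type="ner", mini_batch_size=32)
print(result.detailed_results)  # per-tag precision / recall / F1, as in the table above
print(result.main_score)        # micro F1
```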