initial model commit

- README.md +189 -0
- loss.tsv +151 -0
- pytorch_model.bin +3 -0
- training.log +0 -0
README.md
ADDED
---
tags:
- flair
- token-classification
- sequence-tagger-model
language:
- en
- de
- fr
- it
- nl
- pl
- es
- sv
- da
- no
- fi
- cs
datasets:
- universal_dependencies
inference: false
---

## Multilingual Universal Part-of-Speech Tagging in Flair (fast model)

This is the fast multilingual universal part-of-speech tagging model that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **92.88** (12 UD Treebanks covering English, German, French, Italian, Dutch, Polish, Spanish, Swedish, Danish, Norwegian, Finnish and Czech)

Predicts universal POS tags:

| **tag** | **meaning** |
|---------------------------------|-----------|
| ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |
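For programmatic use, the table above can be kept as a plain mapping. A minimal sketch (`UPOS_MEANINGS` and `describe` are illustrative helpers, not part of Flair):

```python
# Human-readable names for the UPOS tags listed in the table above.
UPOS_MEANINGS = {
    "ADJ": "adjective", "ADP": "adposition", "ADV": "adverb",
    "AUX": "auxiliary", "CCONJ": "coordinating conjunction",
    "DET": "determiner", "INTJ": "interjection", "NOUN": "noun",
    "NUM": "numeral", "PART": "particle", "PRON": "pronoun",
    "PROPN": "proper noun", "PUNCT": "punctuation",
    "SCONJ": "subordinating conjunction", "SYM": "symbol",
    "VERB": "verb", "X": "other",
}

def describe(tag):
    """Return the long name for a UPOS tag, falling back to the tag itself."""
    return UPOS_MEANINGS.get(tag, tag)
```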

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/upos-multi-fast")

# make example sentence
sentence = Sentence("Ich liebe Berlin, as they say.")

# predict POS tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted POS spans
print('The following POS tags are found:')
# iterate over tagged spans and print
for entity in sentence.get_spans('pos'):
    print(entity)
```

This yields the following output:
```
Span [1]: "Ich" [− Labels: PRON (0.9999)]
Span [2]: "liebe" [− Labels: VERB (0.9999)]
Span [3]: "Berlin" [− Labels: PROPN (0.9997)]
Span [4]: "," [− Labels: PUNCT (1.0)]
Span [5]: "as" [− Labels: SCONJ (0.9991)]
Span [6]: "they" [− Labels: PRON (0.9998)]
Span [7]: "say" [− Labels: VERB (0.9998)]
Span [8]: "." [− Labels: PUNCT (1.0)]
```

So, the words "*Ich*" and "*they*" are labeled as **pronouns** (PRON), while "*liebe*" and "*say*" are labeled as **verbs** (VERB) in the multilingual sentence "*Ich liebe Berlin, as they say*".
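If you need the predictions as data rather than printed text, the span format above is easy to parse (in practice you would read the tag and score from the `Span` objects directly). A minimal sketch; `SPAN_RE` and `parse_span_line` are our own helpers, not a Flair API:

```python
import re

# Matches printed lines such as: Span [3]: "Berlin" [− Labels: PROPN (0.9997)]
SPAN_RE = re.compile(r'Span \[(\d+)\]: "(.*?)" \[.*?Labels: (\w+) \(([\d.]+)\)\]')

def parse_span_line(line):
    """Turn one printed span line into an (index, token, tag, confidence) tuple."""
    match = SPAN_RE.match(line.strip())
    if match is None:
        return None
    index, token, tag, score = match.groups()
    return int(index), token, tag, float(score)
```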

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import MultiCorpus
from flair.datasets import UD_ENGLISH, UD_GERMAN, UD_FRENCH, UD_ITALIAN, UD_POLISH, UD_DUTCH, UD_CZECH, \
    UD_DANISH, UD_SPANISH, UD_SWEDISH, UD_NORWEGIAN, UD_FINNISH
from flair.embeddings import StackedEmbeddings, FlairEmbeddings

# 1. make a multi corpus consisting of 12 UD treebanks (in_memory=False here because this corpus becomes large)
corpus = MultiCorpus([
    UD_ENGLISH(in_memory=False),
    UD_GERMAN(in_memory=False),
    UD_DUTCH(in_memory=False),
    UD_FRENCH(in_memory=False),
    UD_ITALIAN(in_memory=False),
    UD_SPANISH(in_memory=False),
    UD_POLISH(in_memory=False),
    UD_CZECH(in_memory=False),
    UD_DANISH(in_memory=False),
    UD_SWEDISH(in_memory=False),
    UD_NORWEGIAN(in_memory=False),
    UD_FINNISH(in_memory=False),
])

# 2. what tag do we want to predict?
tag_type = 'upos'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # contextual string embeddings, forward
    FlairEmbeddings('multi-forward-fast'),

    # contextual string embeddings, backward
    FlairEmbeddings('multi-backward-fast'),
]

# embedding stack consists of forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type,
                        use_crf=False)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/upos-multi-fast',
              train_with_dev=True,
              max_epochs=150)
```
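During training, Flair anneals the learning rate when the training loss plateaus; the loss.tsv below shows exactly this schedule in action (0.1 → 0.05 → 0.025 → 0.0125 → 0.0063). A rough sketch of such a plateau-based schedule, assuming a patience of 3 and an anneal factor of 0.5 (function and parameter names here are illustrative, not the Flair API):

```python
def anneal_schedule(losses, lr=0.1, patience=3, anneal_factor=0.5):
    """Return the learning rate used at each epoch, multiplying it by
    anneal_factor once the loss has not improved for more than
    `patience` consecutive epochs."""
    rates = []
    best = float("inf")
    bad_epochs = 0
    for loss in losses:
        rates.append(lr)
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
        if bad_epochs > patience:
            lr *= anneal_factor
            bad_epochs = 0
    return rates
```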

---

### Cite

Please cite the following paper when using this model.

```bibtex
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv
ADDED
EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS TRAIN_PRECISION TRAIN_RECALL TRAIN_ACCURACY TRAIN_F-SCORE DEV_LOSS DEV_PRECISION DEV_RECALL DEV_ACCURACY DEV_F-SCORE TEST_LOSS TEST_PRECISION TEST_RECALL TEST_ACCURACY TEST_F-SCORE
0 17:20:53 0 0.1000 0.9190797222630016 _ _ _ _ _ _ _ _ _ _ 0.777 0.777 0.777 0.777
1 17:24:38 0 0.1000 0.6716432737287085 _ _ _ _ _ _ _ _ _ _ 0.8233 0.8233 0.8233 0.8233
2 17:28:26 0 0.1000 0.5960667036171194 _ _ _ _ _ _ _ _ _ _ 0.8401 0.8401 0.8401 0.8401
3 17:32:07 0 0.1000 0.5517474701099455 _ _ _ _ _ _ _ _ _ _ 0.8553 0.8553 0.8553 0.8553
4 17:35:50 0 0.1000 0.5247127810490859 _ _ _ _ _ _ _ _ _ _ 0.8654 0.8654 0.8654 0.8654
5 17:39:48 0 0.1000 0.5020367544745151 _ _ _ _ _ _ _ _ _ _ 0.8735 0.8735 0.8735 0.8735
6 17:43:32 0 0.1000 0.48632956523237236 _ _ _ _ _ _ _ _ _ _ 0.8774 0.8774 0.8774 0.8774
7 17:47:14 0 0.1000 0.47448061674651854 _ _ _ _ _ _ _ _ _ _ 0.8797 0.8797 0.8797 0.8797
8 17:50:58 0 0.1000 0.4657747268559462 _ _ _ _ _ _ _ _ _ _ 0.8845 0.8845 0.8845 0.8845
9 17:54:40 0 0.1000 0.4548023839797725 _ _ _ _ _ _ _ _ _ _ 0.8892 0.8892 0.8892 0.8892
10 17:58:22 0 0.1000 0.4479015595216538 _ _ _ _ _ _ _ _ _ _ 0.888 0.888 0.888 0.888
11 18:02:05 0 0.1000 0.4417490141883859 _ _ _ _ _ _ _ _ _ _ 0.8913 0.8913 0.8913 0.8913
12 18:05:47 0 0.1000 0.4353397585937434 _ _ _ _ _ _ _ _ _ _ 0.8957 0.8957 0.8957 0.8957
13 18:09:45 0 0.1000 0.4300186406485238 _ _ _ _ _ _ _ _ _ _ 0.8971 0.8971 0.8971 0.8971
14 18:13:33 0 0.1000 0.42514593633532055 _ _ _ _ _ _ _ _ _ _ 0.8986 0.8986 0.8986 0.8986
15 18:17:16 0 0.1000 0.4217662339969708 _ _ _ _ _ _ _ _ _ _ 0.899 0.899 0.899 0.899
16 18:21:19 0 0.1000 0.4185326483417642 _ _ _ _ _ _ _ _ _ _ 0.8996 0.8996 0.8996 0.8996
17 18:25:02 0 0.1000 0.41529573476239035 _ _ _ _ _ _ _ _ _ _ 0.9021 0.9021 0.9021 0.9021
18 18:28:44 0 0.1000 0.41164514654259365 _ _ _ _ _ _ _ _ _ _ 0.9014 0.9014 0.9014 0.9014
19 18:32:27 0 0.1000 0.40819807794557744 _ _ _ _ _ _ _ _ _ _ 0.9026 0.9026 0.9026 0.9026
20 18:36:09 0 0.1000 0.4072363424415511 _ _ _ _ _ _ _ _ _ _ 0.904 0.904 0.904 0.904
21 18:39:47 0 0.1000 0.40504910720195475 _ _ _ _ _ _ _ _ _ _ 0.905 0.905 0.905 0.905
22 18:43:34 0 0.1000 0.40202132739125 _ _ _ _ _ _ _ _ _ _ 0.9047 0.9047 0.9047 0.9047
23 18:47:11 0 0.1000 0.4016447351458934 _ _ _ _ _ _ _ _ _ _ 0.9048 0.9048 0.9048 0.9048
24 18:50:53 0 0.1000 0.3990221622819437 _ _ _ _ _ _ _ _ _ _ 0.9057 0.9057 0.9057 0.9057
25 18:54:29 0 0.1000 0.39748555323960455 _ _ _ _ _ _ _ _ _ _ 0.9066 0.9066 0.9066 0.9066
26 18:58:06 0 0.1000 0.39510030264373813 _ _ _ _ _ _ _ _ _ _ 0.9074 0.9074 0.9074 0.9074
27 19:01:41 0 0.1000 0.3918911876917555 _ _ _ _ _ _ _ _ _ _ 0.9092 0.9092 0.9092 0.9092
28 19:05:16 0 0.1000 0.3908691110859825 _ _ _ _ _ _ _ _ _ _ 0.909 0.909 0.909 0.909
29 19:08:51 0 0.1000 0.39194263081204284 _ _ _ _ _ _ _ _ _ _ 0.9083 0.9083 0.9083 0.9083
30 19:12:54 1 0.1000 0.390020066672355 _ _ _ _ _ _ _ _ _ _ 0.9101 0.9101 0.9101 0.9101
31 19:16:30 0 0.1000 0.3874159374133058 _ _ _ _ _ _ _ _ _ _ 0.9101 0.9101 0.9101 0.9101
32 19:20:07 0 0.1000 0.38671515579703375 _ _ _ _ _ _ _ _ _ _ 0.9102 0.9102 0.9102 0.9102
33 19:23:42 0 0.1000 0.38522100101608026 _ _ _ _ _ _ _ _ _ _ 0.91 0.91 0.91 0.91
34 19:27:18 0 0.1000 0.38452733974479464 _ _ _ _ _ _ _ _ _ _ 0.9115 0.9115 0.9115 0.9115
35 19:30:53 0 0.1000 0.38427426408016185 _ _ _ _ _ _ _ _ _ _ 0.9109 0.9109 0.9109 0.9109
36 19:34:29 0 0.1000 0.38305842953409763 _ _ _ _ _ _ _ _ _ _ 0.9124 0.9124 0.9124 0.9124
37 19:38:04 0 0.1000 0.38233189722553446 _ _ _ _ _ _ _ _ _ _ 0.9114 0.9114 0.9114 0.9114
38 19:41:47 0 0.1000 0.38083049013437587 _ _ _ _ _ _ _ _ _ _ 0.9125 0.9125 0.9125 0.9125
39 19:45:25 0 0.1000 0.38015748730049487 _ _ _ _ _ _ _ _ _ _ 0.9127 0.9127 0.9127 0.9127
40 19:49:03 0 0.1000 0.37948072639638775 _ _ _ _ _ _ _ _ _ _ 0.9123 0.9123 0.9123 0.9123
41 19:52:39 0 0.1000 0.3794758443748531 _ _ _ _ _ _ _ _ _ _ 0.9126 0.9126 0.9126 0.9126
42 19:56:13 1 0.1000 0.377902250212404 _ _ _ _ _ _ _ _ _ _ 0.9135 0.9135 0.9135 0.9135
43 19:59:47 0 0.1000 0.3766043377062385 _ _ _ _ _ _ _ _ _ _ 0.9133 0.9133 0.9133 0.9133
44 20:03:19 0 0.1000 0.37573553599806514 _ _ _ _ _ _ _ _ _ _ 0.9137 0.9137 0.9137 0.9137
45 20:06:50 0 0.1000 0.3748471322266952 _ _ _ _ _ _ _ _ _ _ 0.9135 0.9135 0.9135 0.9135
46 20:10:19 0 0.1000 0.3744547664734294 _ _ _ _ _ _ _ _ _ _ 0.9132 0.9132 0.9132 0.9132
47 20:13:52 0 0.1000 0.37502166761711847 _ _ _ _ _ _ _ _ _ _ 0.9143 0.9143 0.9143 0.9143
48 20:17:41 1 0.1000 0.3721831904085566 _ _ _ _ _ _ _ _ _ _ 0.9145 0.9145 0.9145 0.9145
49 20:21:08 0 0.1000 0.3714427081547894 _ _ _ _ _ _ _ _ _ _ 0.9148 0.9148 0.9148 0.9148
50 20:24:37 0 0.1000 0.37246603286469393 _ _ _ _ _ _ _ _ _ _ 0.9145 0.9145 0.9145 0.9145
51 20:28:05 1 0.1000 0.3704952007998493 _ _ _ _ _ _ _ _ _ _ 0.9144 0.9144 0.9144 0.9144
52 20:31:34 0 0.1000 0.3703908653195547 _ _ _ _ _ _ _ _ _ _ 0.9143 0.9143 0.9143 0.9143
53 20:35:03 0 0.1000 0.37107703269995407 _ _ _ _ _ _ _ _ _ _ 0.916 0.916 0.916 0.916
54 20:38:32 1 0.1000 0.37049108522393775 _ _ _ _ _ _ _ _ _ _ 0.9153 0.9153 0.9153 0.9153
55 20:42:02 2 0.1000 0.37020046933146 _ _ _ _ _ _ _ _ _ _ 0.9152 0.9152 0.9152 0.9152
56 20:45:33 0 0.1000 0.36912775061910197 _ _ _ _ _ _ _ _ _ _ 0.9159 0.9159 0.9159 0.9159
57 20:49:02 0 0.1000 0.3671512951590983 _ _ _ _ _ _ _ _ _ _ 0.9151 0.9151 0.9151 0.9151
58 20:52:30 0 0.1000 0.3668931041249558 _ _ _ _ _ _ _ _ _ _ 0.9154 0.9154 0.9154 0.9154
59 20:55:59 0 0.1000 0.36706147675945966 _ _ _ _ _ _ _ _ _ _ 0.9172 0.9172 0.9172 0.9172
60 20:59:28 1 0.1000 0.3657434461712214 _ _ _ _ _ _ _ _ _ _ 0.9168 0.9168 0.9168 0.9168
61 21:02:58 0 0.1000 0.3659182143304713 _ _ _ _ _ _ _ _ _ _ 0.916 0.916 0.916 0.916
62 21:06:27 1 0.1000 0.3662726061567449 _ _ _ _ _ _ _ _ _ _ 0.9156 0.9156 0.9156 0.9156
63 21:09:56 2 0.1000 0.3638134179826793 _ _ _ _ _ _ _ _ _ _ 0.9163 0.9163 0.9163 0.9163
64 21:13:27 0 0.1000 0.3639648409197477 _ _ _ _ _ _ _ _ _ _ 0.9176 0.9176 0.9176 0.9176
65 21:16:56 1 0.1000 0.3648721676257142 _ _ _ _ _ _ _ _ _ _ 0.9168 0.9168 0.9168 0.9168
66 21:20:25 2 0.1000 0.3633504617117189 _ _ _ _ _ _ _ _ _ _ 0.917 0.917 0.917 0.917
67 21:23:53 0 0.1000 0.3645101590239933 _ _ _ _ _ _ _ _ _ _ 0.9171 0.9171 0.9171 0.9171
68 21:27:23 1 0.1000 0.3630188812231188 _ _ _ _ _ _ _ _ _ _ 0.916 0.916 0.916 0.916
69 21:30:52 0 0.1000 0.3636482348753987 _ _ _ _ _ _ _ _ _ _ 0.9167 0.9167 0.9167 0.9167
70 21:34:20 1 0.1000 0.3639552533385855 _ _ _ _ _ _ _ _ _ _ 0.9173 0.9173 0.9173 0.9173
71 21:37:48 2 0.1000 0.3624460829547383 _ _ _ _ _ _ _ _ _ _ 0.9162 0.9162 0.9162 0.9162
72 21:41:17 0 0.1000 0.36187481353822637 _ _ _ _ _ _ _ _ _ _ 0.9167 0.9167 0.9167 0.9167
73 21:44:48 0 0.1000 0.3638685102028578 _ _ _ _ _ _ _ _ _ _ 0.9178 0.9178 0.9178 0.9178
74 21:48:20 1 0.1000 0.3608434076808494 _ _ _ _ _ _ _ _ _ _ 0.9167 0.9167 0.9167 0.9167
75 21:51:49 0 0.1000 0.3602308806752747 _ _ _ _ _ _ _ _ _ _ 0.9172 0.9172 0.9172 0.9172
76 21:55:23 0 0.1000 0.3598624867144525 _ _ _ _ _ _ _ _ _ _ 0.9182 0.9182 0.9182 0.9182
77 21:58:51 0 0.1000 0.36065239670790644 _ _ _ _ _ _ _ _ _ _ 0.9181 0.9181 0.9181 0.9181
78 22:02:19 1 0.1000 0.360412118996118 _ _ _ _ _ _ _ _ _ _ 0.9175 0.9175 0.9175 0.9175
79 22:05:47 2 0.1000 0.3591554729788417 _ _ _ _ _ _ _ _ _ _ 0.9181 0.9181 0.9181 0.9181
80 22:09:15 0 0.1000 0.358690284100161 _ _ _ _ _ _ _ _ _ _ 0.9186 0.9186 0.9186 0.9186
81 22:13:04 0 0.1000 0.35836367092432064 _ _ _ _ _ _ _ _ _ _ 0.9179 0.9179 0.9179 0.9179
82 22:16:35 0 0.1000 0.3590073234306779 _ _ _ _ _ _ _ _ _ _ 0.9173 0.9173 0.9173 0.9173
83 22:20:03 1 0.1000 0.35931719463767836 _ _ _ _ _ _ _ _ _ _ 0.9173 0.9173 0.9173 0.9173
84 22:23:32 2 0.1000 0.35991999209495307 _ _ _ _ _ _ _ _ _ _ 0.9167 0.9167 0.9167 0.9167
85 22:27:00 3 0.1000 0.3583576700660675 _ _ _ _ _ _ _ _ _ _ 0.9182 0.9182 0.9182 0.9182
86 22:30:28 0 0.0500 0.3383489196435348 _ _ _ _ _ _ _ _ _ _ 0.9214 0.9214 0.9214 0.9214
87 22:33:56 0 0.0500 0.3332596721458173 _ _ _ _ _ _ _ _ _ _ 0.922 0.922 0.922 0.922
88 22:37:27 0 0.0500 0.3287687047201385 _ _ _ _ _ _ _ _ _ _ 0.9229 0.9229 0.9229 0.9229
89 22:40:56 0 0.0500 0.3285579723036529 _ _ _ _ _ _ _ _ _ _ 0.9225 0.9225 0.9225 0.9225
90 22:44:25 0 0.0500 0.3249955790471447 _ _ _ _ _ _ _ _ _ _ 0.923 0.923 0.923 0.923
91 22:47:56 0 0.0500 0.32664729893809913 _ _ _ _ _ _ _ _ _ _ 0.9227 0.9227 0.9227 0.9227
92 22:51:31 1 0.0500 0.32461294523941386 _ _ _ _ _ _ _ _ _ _ 0.9229 0.9229 0.9229 0.9229
93 22:55:00 0 0.0500 0.32429037422202617 _ _ _ _ _ _ _ _ _ _ 0.9236 0.9236 0.9236 0.9236
94 22:58:32 0 0.0500 0.3235636986328555 _ _ _ _ _ _ _ _ _ _ 0.923 0.923 0.923 0.923
95 23:02:00 0 0.0500 0.3222244535159027 _ _ _ _ _ _ _ _ _ _ 0.9238 0.9238 0.9238 0.9238
96 23:05:29 0 0.0500 0.32155113324754814 _ _ _ _ _ _ _ _ _ _ 0.9235 0.9235 0.9235 0.9235
97 23:09:01 0 0.0500 0.321625706469336 _ _ _ _ _ _ _ _ _ _ 0.924 0.924 0.924 0.924
98 23:12:30 1 0.0500 0.32150654914675636 _ _ _ _ _ _ _ _ _ _ 0.9236 0.9236 0.9236 0.9236
99 23:16:06 0 0.0500 0.32067421640541527 _ _ _ _ _ _ _ _ _ _ 0.9238 0.9238 0.9238 0.9238
100 23:19:38 0 0.0500 0.3206731891157032 _ _ _ _ _ _ _ _ _ _ 0.9246 0.9246 0.9246 0.9246
101 23:23:06 1 0.0500 0.319841820813021 _ _ _ _ _ _ _ _ _ _ 0.9242 0.9242 0.9242 0.9242
102 23:26:33 0 0.0500 0.3191204305807779 _ _ _ _ _ _ _ _ _ _ 0.9239 0.9239 0.9239 0.9239
103 23:30:01 0 0.0500 0.3174570998350242 _ _ _ _ _ _ _ _ _ _ 0.9241 0.9241 0.9241 0.9241
104 23:33:29 0 0.0500 0.3191340444096926 _ _ _ _ _ _ _ _ _ _ 0.9243 0.9243 0.9243 0.9243
105 23:36:57 1 0.0500 0.31770479497054654 _ _ _ _ _ _ _ _ _ _ 0.9241 0.9241 0.9241 0.9241
106 23:40:26 2 0.0500 0.3172146900209962 _ _ _ _ _ _ _ _ _ _ 0.9239 0.9239 0.9239 0.9239
107 23:43:54 0 0.0500 0.3160222761159336 _ _ _ _ _ _ _ _ _ _ 0.924 0.924 0.924 0.924
108 23:47:24 0 0.0500 0.31664770050950836 _ _ _ _ _ _ _ _ _ _ 0.9244 0.9244 0.9244 0.9244
109 23:50:54 1 0.0500 0.3151855416235648 _ _ _ _ _ _ _ _ _ _ 0.9243 0.9243 0.9243 0.9243
110 23:54:22 0 0.0500 0.3156811461111346 _ _ _ _ _ _ _ _ _ _ 0.924 0.924 0.924 0.924
111 23:57:54 1 0.0500 0.3161979560382085 _ _ _ _ _ _ _ _ _ _ 0.9246 0.9246 0.9246 0.9246
112 00:01:23 2 0.0500 0.31593882176664106 _ _ _ _ _ _ _ _ _ _ 0.925 0.925 0.925 0.925
113 00:05:18 3 0.0500 0.3152490947733476 _ _ _ _ _ _ _ _ _ _ 0.9246 0.9246 0.9246 0.9246
114 00:08:47 0 0.0250 0.3063757599452114 _ _ _ _ _ _ _ _ _ _ 0.926 0.926 0.926 0.926
115 00:12:17 0 0.0250 0.3037663424432667 _ _ _ _ _ _ _ _ _ _ 0.9264 0.9264 0.9264 0.9264
116 00:15:50 0 0.0250 0.3026768150561347 _ _ _ _ _ _ _ _ _ _ 0.9262 0.9262 0.9262 0.9262
117 00:19:23 0 0.0250 0.301911720552316 _ _ _ _ _ _ _ _ _ _ 0.9262 0.9262 0.9262 0.9262
118 00:22:54 0 0.0250 0.30135701154769623 _ _ _ _ _ _ _ _ _ _ 0.9265 0.9265 0.9265 0.9265
119 00:26:28 0 0.0250 0.30062567218434066 _ _ _ _ _ _ _ _ _ _ 0.9267 0.9267 0.9267 0.9267
120 00:30:04 0 0.0250 0.30065195452518656 _ _ _ _ _ _ _ _ _ _ 0.9267 0.9267 0.9267 0.9267
121 00:33:34 1 0.0250 0.29976059630590474 _ _ _ _ _ _ _ _ _ _ 0.9268 0.9268 0.9268 0.9268
122 00:37:02 0 0.0250 0.2985034517248484 _ _ _ _ _ _ _ _ _ _ 0.9269 0.9269 0.9269 0.9269
123 00:40:31 0 0.0250 0.2982299711554555 _ _ _ _ _ _ _ _ _ _ 0.9272 0.9272 0.9272 0.9272
124 00:44:00 0 0.0250 0.2981091206836624 _ _ _ _ _ _ _ _ _ _ 0.927 0.927 0.927 0.927
125 00:47:28 0 0.0250 0.2959095034530295 _ _ _ _ _ _ _ _ _ _ 0.9271 0.9271 0.9271 0.9271
126 00:51:01 0 0.0250 0.29701172573023016 _ _ _ _ _ _ _ _ _ _ 0.9268 0.9268 0.9268 0.9268
127 00:54:29 1 0.0250 0.2971695579327705 _ _ _ _ _ _ _ _ _ _ 0.9272 0.9272 0.9272 0.9272
128 00:57:58 2 0.0250 0.2969908203507443 _ _ _ _ _ _ _ _ _ _ 0.9272 0.9272 0.9272 0.9272
129 01:01:27 3 0.0250 0.2980158057176761 _ _ _ _ _ _ _ _ _ _ 0.9273 0.9273 0.9273 0.9273
130 01:04:56 0 0.0125 0.2920450986249077 _ _ _ _ _ _ _ _ _ _ 0.9273 0.9273 0.9273 0.9273
131 01:08:31 0 0.0125 0.29132962870162377 _ _ _ _ _ _ _ _ _ _ 0.9278 0.9278 0.9278 0.9278
132 01:11:59 0 0.0125 0.29008412956788265 _ _ _ _ _ _ _ _ _ _ 0.9277 0.9277 0.9277 0.9277
133 01:15:27 0 0.0125 0.29002047444153295 _ _ _ _ _ _ _ _ _ _ 0.9279 0.9279 0.9279 0.9279
134 01:18:56 0 0.0125 0.2895771512047742 _ _ _ _ _ _ _ _ _ _ 0.928 0.928 0.928 0.928
135 01:22:29 0 0.0125 0.2890135376831141 _ _ _ _ _ _ _ _ _ _ 0.9281 0.9281 0.9281 0.9281
136 01:25:58 0 0.0125 0.28820534214831617 _ _ _ _ _ _ _ _ _ _ 0.928 0.928 0.928 0.928
137 01:29:27 0 0.0125 0.2886223816855515 _ _ _ _ _ _ _ _ _ _ 0.9282 0.9282 0.9282 0.9282
138 01:32:55 1 0.0125 0.28920243310231414 _ _ _ _ _ _ _ _ _ _ 0.9281 0.9281 0.9281 0.9281
139 01:36:23 2 0.0125 0.28792437217338346 _ _ _ _ _ _ _ _ _ _ 0.9279 0.9279 0.9279 0.9279
140 01:39:52 0 0.0125 0.28707431522855614 _ _ _ _ _ _ _ _ _ _ 0.9281 0.9281 0.9281 0.9281
141 01:43:20 0 0.0125 0.28832379478522974 _ _ _ _ _ _ _ _ _ _ 0.9283 0.9283 0.9283 0.9283
142 01:46:48 1 0.0125 0.28975775159598055 _ _ _ _ _ _ _ _ _ _ 0.9282 0.9282 0.9282 0.9282
143 01:50:16 2 0.0125 0.2880531497794666 _ _ _ _ _ _ _ _ _ _ 0.928 0.928 0.928 0.928
144 01:53:49 3 0.0125 0.287825567466029 _ _ _ _ _ _ _ _ _ _ 0.9285 0.9285 0.9285 0.9285
145 01:57:18 0 0.0063 0.28631645248738896 _ _ _ _ _ _ _ _ _ _ 0.9286 0.9286 0.9286 0.9286
146 02:01:13 0 0.0063 0.28486480532279373 _ _ _ _ _ _ _ _ _ _ 0.9286 0.9286 0.9286 0.9286
147 02:04:41 0 0.0063 0.2841264129166399 _ _ _ _ _ _ _ _ _ _ 0.9287 0.9287 0.9287 0.9287
148 02:08:10 0 0.0063 0.28465119782251286 _ _ _ _ _ _ _ _ _ _ 0.9288 0.9288 0.9288 0.9288
149 02:11:38 1 0.0063 0.28372669010548646 _ _ _ _ _ _ _ _ _ _ 0.9288 0.9288 0.9288 0.9288
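The `_` columns are placeholders: since the model was trained with `train_with_dev=True`, no separate dev-set evaluation was run, so only the train loss and the test scores are logged each epoch. A small stdlib-only sketch for loading such a file (`read_loss_tsv` is our own helper, not part of Flair):

```python
import csv
import io

def read_loss_tsv(text):
    """Parse a Flair loss.tsv into a list of dicts, converting numeric
    fields to float and mapping the '_' placeholders to None."""
    rows = []
    for row in csv.DictReader(io.StringIO(text), delimiter="\t"):
        parsed = {}
        for key, value in row.items():
            if value == "_":
                parsed[key] = None
            else:
                try:
                    parsed[key] = float(value)
                except ValueError:
                    parsed[key] = value  # non-numeric fields, e.g. TIMESTAMP
        rows.append(parsed)
    return rows
```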
pytorch_model.bin
ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:eb8d2732935a24eb64a73891e8ae3f12b37a5cf8ca1e3d8d22059e21a32ec912
size 72105507
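This file is a Git LFS pointer, not the model weights themselves: the ~72 MB binary is fetched separately (e.g. via `git lfs pull`) and must hash to the recorded `oid`. A sketch of that integrity check (`check_lfs_pointer` is our own helper, not part of git-lfs):

```python
import hashlib

def check_lfs_pointer(pointer_text, blob_bytes):
    """Check blob_bytes against the oid and size recorded in a Git LFS
    pointer file (the three-line key/value format shown above)."""
    fields = dict(line.split(" ", 1) for line in pointer_text.strip().splitlines())
    oid = fields["oid"].split(":", 1)[1]  # strip the "sha256:" prefix
    size = int(fields["size"])
    return len(blob_bytes) == size and hashlib.sha256(blob_bytes).hexdigest() == oid
```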
training.log
ADDED
The diff for this file is too large to render.