---
language: de
license: mit
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: dbmdz/bert-tiny-historic-multilingual-cased
widget:
- text: Es war am 25sten , als Lord Corn wollis Dublin mit seinem Gefolge und mehrern
    Truppen verließ , um in einer Central - Lage bey Sligo die Operationen der Armee
    persönlich zu dirigiren . Der Feind dürfte bald in die Enge kommen , da Gen .
    Lacke mit 6000 Mann ihm entgegen marschirt .
---

# Fine-tuned Flair Model on German HIPE-2020 Dataset (HIPE-2022)

This Flair model was fine-tuned on the
[German HIPE-2020](https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-hipe2020.md)
NER dataset using [hmBERT Tiny](https://huggingface.co/dbmdz/bert-tiny-historic-multilingual-cased) as its backbone language model.

The HIPE-2020 dataset comprises newspapers from the mid-19th to the mid-20th century. Further information can be found
[here](https://dl.acm.org/doi/abs/10.1007/978-3-030-58219-7_21).

The following named entity types were annotated: `loc`, `org`, `pers`, `prod`, `time` and `comp`.
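
# Usage

A minimal usage sketch with Flair is shown below. The model id is one of the linked seed repositories and serves only as an example; replace it with the id of the repository you are actually viewing.

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load the fine-tuned tagger from the Hugging Face Hub.
# Example id (seed 1 of the best configuration) -- replace as needed.
tagger = SequenceTagger.load(
    "stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
)

# Example sentence taken from the widget above
sentence = Sentence(
    "Es war am 25sten , als Lord Corn wollis Dublin mit seinem Gefolge und "
    "mehrern Truppen verließ , um in einer Central - Lage bey Sligo die "
    "Operationen der Armee persönlich zu dirigiren ."
)

# Predict NER tags
tagger.predict(sentence)

# Print detected entity spans
for entity in sentence.get_spans("ner"):
    print(entity)
```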

# Results

We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration:

* Batch Sizes: `[4, 8]`
* Learning Rates: `[5e-05, 3e-05]`

We report the micro F1-score on the development set:

| Configuration     | Seed 1       | Seed 2          | Seed 3       | Seed 4       | Seed 5       | Average         |
|-------------------|--------------|-----------------|--------------|--------------|--------------|-----------------|
| `bs4-e10-lr5e-05` | [0.3947][1]  | [0.3980][2]      | [0.3655][3]  | [0.3703][4]  | [0.3858][5]  | 0.3829 ± 0.0145 |
| `bs8-e10-lr5e-05` | [0.3744][6]  | [0.3819][7]      | [0.3486][8]  | [0.3506][9]  | [0.3645][10] | 0.3640 ± 0.0145 |
| `bs4-e10-lr3e-05` | [0.3415][11] | [0.3579][12]     | [0.3291][13] | [0.3351][14] | [0.3549][15] | 0.3437 ± 0.0124 |
| `bs8-e10-lr3e-05` | [0.3282][16] | [**0.3360**][17] | [0.3138][18] | [0.3172][19] | [0.3422][20] | 0.3275 ± 0.0121 |

[1]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[2]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[3]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[4]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[5]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[6]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[7]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[8]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[9]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[10]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[11]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[12]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[13]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[14]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[15]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[16]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[17]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[18]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[19]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[20]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5

The [training log](training.log) and TensorBoard logs (the latter are not available for the hmBERT Base model) are also uploaded to the model hub.

More information about fine-tuning can be found in the [hmBench repository](https://github.com/stefan-it/hmBench).
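
As a rough orientation, the sketch below shows how one configuration of the hyper-parameter grid above could be fine-tuned with Flair. The corpus loader, hidden size and output path are assumptions on my part; the exact training script lives in the hmBench repository linked above.

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Load the German HIPE-2020 split of HIPE-2022
# (loader name assumed from flair.datasets; verify against your Flair version)
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

for batch_size in [4, 8]:
    for lr in [5e-05, 3e-05]:
        # hmBERT Tiny backbone: last layer, first-subtoken pooling
        # (mirroring "poolingfirst-layers-1" in the repository names)
        embeddings = TransformerWordEmbeddings(
            model="dbmdz/bert-tiny-historic-multilingual-cased",
            layers="-1",
            subtoken_pooling="first",
            fine_tune=True,
        )
        tagger = SequenceTagger(
            hidden_size=256,  # assumed value
            embeddings=embeddings,
            tag_dictionary=label_dict,
            tag_type="ner",
            use_crf=False,  # "crfFalse" in the repository names
            use_rnn=False,
        )
        trainer = ModelTrainer(tagger, corpus)
        trainer.fine_tune(
            f"models/bs{batch_size}-e10-lr{lr}",  # assumed output path
            learning_rate=lr,
            mini_batch_size=batch_size,
            max_epochs=10,
        )
```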

# Acknowledgements

We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models.

Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️