---
language: en
license: mit
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: dbmdz/bert-base-historic-multilingual-64k-td-cased
widget:
- text: On Wednesday , a public dinner was given by the Conservative Burgesses of
    Leads , to the Conservative members of the Leeds Town Council , in the Music Hall
    , Albion-street , which was very numerously attended .
---

# Fine-tuned Flair Model on TopRes19th English NER Dataset (HIPE-2022)

This Flair model was fine-tuned on the
[TopRes19th English](https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-topres19th.md)
NER Dataset using hmBERT 64k as the backbone language model.

The TopRes19th dataset consists of NE-annotated historical English newspaper articles from the 19th century.

The following NEs were annotated: `BUILDING`, `LOC` and `STREET`.
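
# Usage

The model can be used with the [Flair](https://github.com/flairNLP/flair) library. The following is a minimal sketch that loads one of the fine-tuned checkpoints linked in the Results table below (the best-scoring seed of the `bs8-e10-lr5e-05` configuration is used here as an example; adjust the repository id to the checkpoint you want to use) and tags the widget sentence:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load a fine-tuned checkpoint from the Hugging Face Hub.
# The id below is the best-scoring seed of the bs8-e10-lr5e-05
# configuration (see the Results table).
tagger = SequenceTagger.load(
    "stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
)

# Example sentence from a 19th-century newspaper article (the widget text)
sentence = Sentence(
    "On Wednesday , a public dinner was given by the Conservative "
    "Burgesses of Leads , to the Conservative members of the Leeds "
    "Town Council , in the Music Hall , Albion-street , which was "
    "very numerously attended ."
)

# Predict BUILDING, LOC and STREET entities
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)
```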

# Results

We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration:

* Batch Sizes: `[4, 8]`
* Learning Rates: `[3e-05, 5e-05]`

We report the micro F1-score on the development set (the averaging over seeds is sketched after the table):

| Configuration     | Seed 1       | Seed 2       | Seed 3       | Seed 4       | Seed 5          | Average         |
|-------------------|--------------|--------------|--------------|--------------|-----------------|-----------------|
| `bs8-e10-lr5e-05` | [0.7918][1]  | [0.7984][2]  | [0.7860][3]  | [0.7841][4]  | [**0.7992**][5] | 0.7919 ± 0.0069 |
| `bs8-e10-lr3e-05` | [0.7886][6]  | [0.8142][7]  | [0.7925][8]  | [0.7865][9]  | [0.7757][10]    | 0.7915 ± 0.0141 |
| `bs4-e10-lr3e-05` | [0.7838][11] | [0.7885][12] | [0.7934][13] | [0.8049][14] | [0.7862][15]    | 0.7914 ± 0.0084 |
| `bs4-e10-lr5e-05` | [0.7621][16] | [0.7017][17] | [0.7578][18] | [0.7708][19] | [0.7686][20]    | 0.7522 ± 0.0287 |

[1]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[2]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[3]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[4]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[5]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[6]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[7]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[8]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[9]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[10]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[11]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[12]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[13]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[14]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[15]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[16]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[17]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[18]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[19]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[20]: https://hf.co/stefan-it/hmbench-topres19th-en-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
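
The averages in the last column are the mean and sample standard deviation over the five seeds. A minimal sketch of the computation, which reproduces the first row of the table:

```python
import statistics

# Development micro F1-scores of the five seeds for bs8-e10-lr5e-05
scores = [0.7918, 0.7984, 0.7860, 0.7841, 0.7992]

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)  # sample standard deviation (n - 1)

print(f"{mean:.4f} ± {stdev:.4f}")  # 0.7919 ± 0.0069
```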

The [training log](training.log) and TensorBoard logs (not available for the hmBERT Base model) are also uploaded to the model hub.

More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
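
For reference, the following is a minimal Flair fine-tuning sketch that mirrors the best configuration above (hmBERT 64k backbone, last layer only, first-subtoken pooling, no CRF, batch size 8, learning rate 5e-05, 10 epochs). The exact hmBench training script may differ in details such as dataset preprocessing and scheduler settings:

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# TopRes19th English split of HIPE-2022
corpus = NER_HIPE_2022(dataset_name="topres19th", language="en")
label_dict = corpus.make_label_dictionary(label_type="ner")

# hmBERT 64k backbone, last layer only, first-subtoken pooling
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# No CRF on top, as in the evaluated configurations (crfFalse)
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/taggers/topres19th-hmbert-64k",
    learning_rate=5e-05,
    mini_batch_size=8,
    max_epochs=10,
)
```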

# Acknowledgements

We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models.

Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️