|
--- |
|
language: nl |
|
license: mit |
|
tags: |
|
- flair |
|
- token-classification |
|
- sequence-tagger-model |
|
base_model: dbmdz/bert-tiny-historic-multilingual-cased |
|
widget: |
|
- text: Professoren der Geneeskun dige Faculteit te Groningen alsook van de HH , Doctoren |
|
en Chirurgijns van Groningen , Friesland , Noordholland , Overijssel , Gelderland |
|
, Drenthe , in welke Provinciën dit Elixir als Medicament voor Mond en Tanden |
|
reeds jaren bakend is . |
|
--- |
|
|
|
# Fine-tuned Flair Model on Dutch ICDAR-Europeana NER Dataset |
|
|
|
This Flair model was fine-tuned on the |
|
[Dutch ICDAR-Europeana](https://github.com/stefan-it/historic-domain-adaptation-icdar) |
|
NER Dataset using hmBERT Tiny as the backbone language model.
|
|
|
The ICDAR-Europeana NER Dataset is a preprocessed variant of the |
|
[Europeana NER Corpora](https://github.com/EuropeanaNewspapers/ner-corpora) for Dutch and French. |
|
|
|
The following named entity types are annotated: `PER`, `LOC` and `ORG`.
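
The following is a minimal inference sketch with Flair. The model id shown is the seed-1 repository of the bold-marked configuration in the results table below; substitute the id of the checkpoint you actually want to load. The example sentence is the (OCR-noisy) widget text from above, shortened:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load a fine-tuned checkpoint from the Hugging Face Hub.
# This id is one of the per-seed repositories linked below; adjust as needed.
tagger = SequenceTagger.load(
    "stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
)

# Historic Dutch example (the widget text above, shortened)
sentence = Sentence(
    "Professoren der Geneeskun dige Faculteit te Groningen alsook van de HH , "
    "Doctoren en Chirurgijns van Groningen , Friesland , Noordholland ."
)

# Run NER prediction
tagger.predict(sentence)

# Print the recognized PER/LOC/ORG spans
for entity in sentence.get_spans("ner"):
    print(entity)
```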
|
|
|
# Results |
|
|
|
We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration: |
|
|
|
* Batch Sizes: `[4, 8]` |
|
* Learning Rates: `[5e-05, 3e-05]` |
|
|
|
We report the micro F1-score on the development set:
|
|
|
| Configuration | Seed 1 | Seed 2 | Seed 3 | Seed 4 | Seed 5 | Average | |
|
|-------------------|------------------|--------------|--------------|--------------|--------------|-----------------| |
|
| `bs4-e10-lr5e-05` | [0.5572][1] | [0.5434][2] | [0.5984][3] | [0.5636][4] | [0.5674][5] | 0.566 ± 0.0203 | |
|
| `bs8-e10-lr5e-05` | [0.5072][6] | [0.5287][7] | [0.5641][8] | [0.5438][9] | [0.5346][10] | 0.5357 ± 0.0208 | |
|
| `bs4-e10-lr3e-05` | [0.519][11] | [0.471][12] | [0.5479][13] | [0.498][14] | [0.4977][15] | 0.5067 ± 0.0286 | |
|
| `bs8-e10-lr3e-05` | [**0.4817**][16] | [0.4511][17] | [0.4956][18] | [0.4627][19] | [0.4715][20] | 0.4725 ± 0.0171 | |
|
|
|
[1]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1 |
|
[2]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2 |
|
[3]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3 |
|
[4]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4 |
|
[5]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5 |
|
[6]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1 |
|
[7]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2 |
|
[8]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3 |
|
[9]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4 |
|
[10]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5 |
|
[11]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1 |
|
[12]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2 |
|
[13]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3 |
|
[14]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4 |
|
[15]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5 |
|
[16]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1 |
|
[17]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2 |
|
[18]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3 |
|
[19]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4 |
|
[20]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5 |
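
The reported averages are the seed mean ± the sample standard deviation. For example, for the first row:

```python
from statistics import mean, stdev

# Development-set F1-scores of bs4-e10-lr5e-05 across the five seeds
scores = [0.5572, 0.5434, 0.5984, 0.5636, 0.5674]

print(f"{mean(scores):.3f} ± {stdev(scores):.4f}")  # 0.566 ± 0.0203
```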
|
|
|
The [training log](training.log) and TensorBoard logs (the latter are not available for hmBERT Base models) are also uploaded to the model hub.
|
|
|
More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench). |
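
For orientation, here is a minimal Flair fine-tuning sketch that mirrors one of the searched configurations (batch size 4, learning rate 5e-05, 10 epochs; last transformer layer, first-subtoken pooling, no CRF). It assumes Flair's built-in `NER_ICDAR_EUROPEANA` corpus loader and is not the exact hmBench training script:

```python
from flair.datasets import NER_ICDAR_EUROPEANA
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Dutch split of the ICDAR-Europeana NER corpus
corpus = NER_ICDAR_EUROPEANA(language="nl")
label_dict = corpus.make_label_dictionary(label_type="ner")

# hmBERT Tiny backbone: last layer, first-subtoken pooling, fine-tuned
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-tiny-historic-multilingual-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Plain linear tag head on top of the transformer (no CRF, no RNN)
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/taggers/icdar-nl-hmbert-tiny",
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,
)
```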
|
|
|
# Acknowledgements |
|
|
|
We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and |
|
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models. |
|
|
|
Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC). |
|
Many thanks for providing access to the TPUs ❤️
|
|