---
language: fr
license: mit
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: dbmdz/bert-tiny-historic-multilingual-cased
widget:
- text: 'Parmi les remèdes recommandés par la Société , il faut mentionner celui que
M . Schatzmann , de Lausanne , a proposé :'
---
# Fine-tuned Flair Model on LeTemps French NER Dataset (HIPE-2022)
This Flair model was fine-tuned on the
[LeTemps French](https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-letemps.md)
NER Dataset using hmBERT Tiny as the backbone LM.
The LeTemps dataset consists of NE-annotated historical French newspaper articles from the mid-19th to the mid-20th century.
The following NEs were annotated: `loc`, `org` and `pers`.
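The model can be used with the [Flair](https://github.com/flairNLP/flair) library. A minimal usage sketch is shown below; the model id refers to one of the fine-tuned seed models linked in the results table and is an assumption, so substitute the id of the repository you actually want to load:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load a fine-tuned model from the Hugging Face Hub
# (model id is an assumption -- replace it with the id of this repository)
tagger = SequenceTagger.load(
    "stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
)

# The widget example from above (pre-tokenized historical French text)
sentence = Sentence(
    "Parmi les remèdes recommandés par la Société , il faut mentionner "
    "celui que M . Schatzmann , de Lausanne , a proposé :"
)

# Run NER prediction and print the detected entity spans
tagger.predict(sentence)
for entity in sentence.get_spans("ner"):
    print(entity)
```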
# Results
We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration:
* Batch Sizes: `[4, 8]`
* Learning Rates: `[5e-05, 3e-05]`
We report the micro F1-score on the development set:
| Configuration | Seed 1 | Seed 2 | Seed 3 | Seed 4 | Seed 5 | Average |
|-------------------|--------------|--------------|-----------------|--------------|--------------|-----------------|
| `bs4-e10-lr5e-05` | [0.5297][1] | [0.5073][2] | [0.5106][3] | [0.5111][4] | [0.5282][5] | 0.5174 ± 0.0107 |
| `bs4-e10-lr3e-05` | [0.5012][6] | [0.4703][7] | [**0.5019**][8] | [0.4857][9] | [0.5072][10] | 0.4933 ± 0.0151 |
| `bs8-e10-lr5e-05` | [0.5027][11] | [0.4727][12] | [0.5021][13] | [0.4912][14] | [0.4733][15] | 0.4884 ± 0.0148 |
| `bs8-e10-lr3e-05` | [0.489][16] | [0.4226][17] | [0.4656][18] | [0.4744][19] | [0.4511][20] | 0.4605 ± 0.0253 |
[1]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[2]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[3]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[4]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[5]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[6]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[7]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[8]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[9]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[10]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[11]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[12]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[13]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[14]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[15]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[16]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[17]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[18]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[19]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[20]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbert_tiny-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
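The averages in the last column are the mean over the five seeds together with what appears to be the sample standard deviation (`ddof=1`), which reproduces the reported values. A short sketch for the first row, using the scores from the table above:

```python
import numpy as np

# Development micro F1-scores for bs4-e10-lr5e-05 (five seeds, from the table above)
scores = [0.5297, 0.5073, 0.5106, 0.5111, 0.5282]

# Mean over seeds with the sample standard deviation (ddof=1)
print(f"{np.mean(scores):.4f} ± {np.std(scores, ddof=1):.4f}")  # 0.5174 ± 0.0107
```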
The [training log](training.log) and TensorBoard logs (not available for hmBERT Base model) are also uploaded to the model hub.
More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
# Acknowledgements
We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models.
Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️