---
tags:
- generated_from_trainer
model-index:
- name: ruBert-base-finetuned-pos
results: []
license: mit
datasets:
- disk0dancer/ru_sentances_pos
language:
- ru
metrics:
- accuracy
- f1
pipeline_tag: token-classification
library_name: transformers
---
# ruBert-base-finetuned-pos
This model was fine-tuned from [ai-forever/ruBert-base](https://huggingface.co/ai-forever/ruBert-base) on the [disk0dancer/ru_sentances_pos](https://hf.co/datasets/disk0dancer/ru_sentances_pos) dataset.
All documentation and code can be found on [GitHub](https://github.com/disk0Dancer/rubert-finetuned-pos). A short usage sketch follows the evaluation results below.
It achieves the following results on the evaluation set:
- eval_loss: 0.1544
- eval_precision: 0.8561
- eval_recall: 0.8723
- eval_f1: 0.8642
- eval_accuracy: 0.8822
- eval_runtime: 0.2476
- eval_samples_per_second: 80.775
- eval_steps_per_second: 8.078
- step: 0
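## How to use
A minimal inference sketch, assuming the checkpoint is published under the repo id `disk0dancer/ruBert-base-finetuned-pos` (taken from this card's title); adjust the id if the model lives elsewhere.
```python
# Minimal inference sketch; the repo id below is assumed from the card title.
from transformers import pipeline

pos_tagger = pipeline(
    "token-classification",
    model="disk0dancer/ruBert-base-finetuned-pos",
    aggregation_strategy="simple",  # merge sub-word pieces back into whole words
)

for item in pos_tagger("Мама мыла раму."):
    print(f"{item['word']}\t{item['entity_group']}\t{item['score']:.3f}")
```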
## Model description
BERT + Dense + Softmax + Dropout
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f73a86f9678931cad645df/fnHI0M7WAQ1AkgfXOTIx6.png)
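For reference, loading the checkpoint with `AutoModelForTokenClassification` exposes this head: a dropout layer followed by a dense (linear) classifier on top of the BERT encoder, with the softmax applied to the logits at inference time. A hedged sketch, assuming the same repo id as in the usage example above:
```python
# Sketch of the head described above (BERT encoder -> Dropout -> Dense, softmax on the logits).
# The repo id is assumed from the card title.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "disk0dancer/ruBert-base-finetuned-pos"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)
print(model.dropout, model.classifier)  # Dropout(...), Linear(hidden_size -> number of POS tags)

inputs = tokenizer("Мама мыла раму.", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)  # softmax over the POS tag set
tags = [model.config.id2label[i] for i in probs.argmax(dim=-1)[0].tolist()]
print(list(zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), tags)))
```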
## Training and evaluation data
The model was trained for token classification (part-of-speech tagging) on the dataset linked above; a hedged preprocessing sketch is shown below.
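For token classification, the word-level POS tags in the dataset have to be aligned with BERT's sub-word tokens. The sketch below shows one common way to do this with the fast tokenizer's `word_ids()`; the column names `tokens` and `pos_tags` are assumptions about the dataset schema, not taken from this card.
```python
# Hedged preprocessing sketch: align word-level POS labels with sub-word tokens.
# The column names "tokens" and "pos_tags" are assumed, not confirmed by the card.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruBert-base")

def tokenize_and_align(example):
    enc = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    labels, previous_word = [], None
    for word_id in enc.word_ids():
        if word_id is None:                 # special tokens ([CLS], [SEP])
            labels.append(-100)             # ignored by the loss
        elif word_id != previous_word:      # first sub-token of a word
            labels.append(example["pos_tags"][word_id])
        else:                               # remaining sub-tokens of the same word
            labels.append(-100)
        previous_word = word_id
    enc["labels"] = labels
    return enc
```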
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `Trainer` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
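A hedged `Trainer` sketch that mirrors the hyperparameters above (the default `TrainingArguments` optimizer already uses the listed Adam betas and epsilon, and the default scheduler is linear). It reuses the `tokenize_and_align` helper from the preprocessing sketch; the dataset split names and the `ClassLabel` feature for `pos_tags` are assumptions, not confirmed by this card.
```python
# Training sketch mirroring the listed hyperparameters.
# Split/column names and the ClassLabel schema are assumed, not confirmed by the card.
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer, TrainingArguments)

raw = load_dataset("disk0dancer/ru_sentances_pos")
tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruBert-base")
label_names = raw["train"].features["pos_tags"].feature.names   # assumed schema
tokenized = raw.map(tokenize_and_align)                         # helper from the sketch above

model = AutoModelForTokenClassification.from_pretrained(
    "ai-forever/ruBert-base",
    num_labels=len(label_names),
    id2label=dict(enumerate(label_names)),
)

args = TrainingArguments(
    output_dir="ruBert-base-finetuned-pos",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",   # default Adam betas/epsilon as listed above
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],   # evaluation split name assumed
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```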
### Framework versions
- Transformers 4.39.0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
## Cite
```bibtex
@misc{churakov2024postagginghighlightskeletalstructure,
title={POS-tagging to highlight the skeletal structure of sentences},
author={Grigorii Churakov},
year={2024},
eprint={2411.14393},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2411.14393},
}
```