amichailidis committed 3ce4e4e (parent: 58f4a51): Update README.md

README.md CHANGED
@@ -68,32 +68,4 @@ The following hyperparameters were used during training:

- Transformers 4.23.0
- Pytorch 1.12.1+cu113
- Datasets 2.5.2
- Tokenizers 0.13.1

This model is a fine-tuned version of [alexaapo/greek_legal_bert_v2](https://huggingface.co/alexaapo/greek_legal_bert_v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1262
- Precision: 0.8363
- Recall: 0.8610
- F1: 0.8485
- Accuracy: 0.9758
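As a quick sanity check (not part of the original card), the reported F1 is consistent with being the harmonic mean of the reported precision and recall:

```python
# Verify that the reported F1 (0.8485) is the harmonic mean of the
# reported precision (0.8363) and recall (0.8610).
precision, recall = 0.8363, 0.8610
f1 = 2 * precision * recall / (precision + recall)
assert round(f1, 4) == 0.8485
```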

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.64  | 250  | 0.0911          | 0.7814    | 0.8596 | 0.8187 | 0.9717   |
| 0.1136        | 1.29  | 500  | 0.0857          | 0.7940    | 0.8738 | 0.8320 | 0.9731   |
| 0.1136        | 1.93  | 750  | 0.0890          | 0.8057    | 0.8645 | 0.8341 | 0.9737   |
| 0.0521        | 2.58  | 1000 | 0.0896          | 0.8244    | 0.8610 | 0.8423 | 0.9752   |
| 0.0521        | 3.22  | 1250 | 0.0933          | 0.8329    | 0.8627 | 0.8476 | 0.9762   |
| 0.0352        | 3.87  | 1500 | 0.0926          | 0.8286    | 0.8614 | 0.8447 | 0.9755   |
| 0.0352        | 4.51  | 1750 | 0.1049          | 0.8446    | 0.8528 | 0.8487 | 0.9751   |
| 0.023         | 5.15  | 2000 | 0.1093          | 0.8381    | 0.8586 | 0.8483 | 0.9759   |
| 0.023         | 5.8   | 2250 | 0.1172          | 0.8291    | 0.8614 | 0.8449 | 0.9758   |
| 0.0158        | 6.44  | 2500 | 0.1273          | 0.8189    | 0.8727 | 0.8450 | 0.9758   |
| 0.0158        | 7.09  | 2750 | 0.1246          | 0.8270    | 0.8648 | 0.8455 | 0.9757   |
| 0.0126        | 7.73  | 3000 | 0.1262          | 0.8363    | 0.8610 | 0.8485 | 0.9758   |
| 0.0126        | 8.38  | 3250 | 0.1347          | 0.8247    | 0.8707 | 0.8471 | 0.9753   |
| 0.0089        | 9.02  | 3500 | 0.1325          | 0.8316    | 0.8559 | 0.8435 | 0.9757   |
| 0.0089        | 9.66  | 3750 | 0.1362          | 0.8293    | 0.8638 | 0.8462 | 0.9759   |
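For readers unfamiliar with how the precision/recall/F1/accuracy columns relate, here is a minimal sketch of micro-averaged token-classification metrics. The label scheme and helper function below are illustrative assumptions, not this model's actual evaluation code (model cards like this one are typically produced with a metrics library such as seqeval):

```python
def token_metrics(gold, pred, ignore_label="O"):
    """Micro-averaged precision/recall/F1 over non-"O" labels, plus token accuracy.

    Illustrative sketch only: real NER evaluation usually scores entity
    spans (e.g. via seqeval), not individual tokens.
    """
    assert len(gold) == len(pred)
    tp = sum(1 for g, p in zip(gold, pred) if g == p and g != ignore_label)
    pred_pos = sum(1 for p in pred if p != ignore_label)   # predicted entity tokens
    gold_pos = sum(1 for g in gold if g != ignore_label)   # gold entity tokens
    correct = sum(1 for g, p in zip(gold, pred) if g == p)
    precision = tp / pred_pos if pred_pos else 0.0
    recall = tp / gold_pos if gold_pos else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = correct / len(gold)
    return precision, recall, f1, accuracy

# Hypothetical BIO-tagged sequence, purely for illustration:
gold = ["O", "B-ORG", "I-ORG", "O", "B-LOC", "O"]
pred = ["O", "B-ORG", "O",     "O", "B-LOC", "B-LOC"]
p, r, f, a = token_metrics(gold, pred)
# tp = 2 of 3 predicted and 3 gold entity tokens, 4 of 6 tokens correct,
# so precision = recall = f1 = 2/3 and accuracy = 2/3 here.
```

Note how accuracy sits far above F1 in the table: the dominant "O" label inflates token accuracy, which is why precision/recall/F1 over entity labels are the headline numbers.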