lombardata committed on
Commit
cadc474
1 Parent(s): 05745a4

Upload README.md

 
---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs
model-index:
- name: drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs
  results: []
---
 
drone-DinoVdeau-from-probs is a fine-tuned version of [drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs](https://huggingface.co/drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs). It achieves the following results on the test set (a metric sketch follows the list):

- Loss: 0.4668
- RMSE: 0.1546
- MAE: 0.1143
- KL Divergence: 0.3931
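
Since the model is trained on class-probability targets, these metrics compare predicted and annotated probability vectors. Below is a minimal sketch of how such metrics can be computed; the function name, the epsilon smoothing, and the reduction choices are illustrative assumptions, not the project's evaluation code.

```python
# Hedged sketch: RMSE, MAE, and KL divergence between predicted and target
# class-probability vectors. Reductions and epsilon smoothing are assumptions.
import torch
import torch.nn.functional as F

def distribution_metrics(pred_probs: torch.Tensor, target_probs: torch.Tensor):
    rmse = torch.sqrt(F.mse_loss(pred_probs, target_probs))
    mae = F.l1_loss(pred_probs, target_probs)
    eps = 1e-8  # keeps log() finite for zero-probability classes
    kl = F.kl_div((pred_probs + eps).log(), target_probs, reduction="batchmean")
    return rmse.item(), mae.item(), kl.item()

# Dummy batch: 4 images, 12 classes (matching the class table below).
pred = torch.softmax(torch.randn(4, 12), dim=-1)
target = torch.softmax(torch.randn(4, 12), dim=-1)
print(distribution_metrics(pred, target))
```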

---

# Model description

drone-DinoVdeau-from-probs is built on top of the drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers; a minimal sketch is given below.

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
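
The exact layer sizes are not listed on this card. As a minimal sketch, assuming DINOv2-large's 1024-dimensional features, a single hidden layer, and the 12 classes tabulated below, such a head could look like this:

```python
# Illustrative head only: hidden width and dropout rate are assumptions,
# not the trained configuration.
import torch.nn as nn

num_classes = 12    # matches the class table below
feature_dim = 1024  # DINOv2-large embedding size

classification_head = nn.Sequential(
    nn.Linear(feature_dim, 512),
    nn.BatchNorm1d(512),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(512, num_classes),
)
```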
 
---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species. A hedged inference example is shown below.
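
A minimal inference sketch, assuming the checkpoint loads through the standard transformers Auto classes; the `lombardata/` repo prefix, the input file name, and the 0.5 decision threshold are assumptions, not documented behavior.

```python
# Hedged sketch: repo id, input file, and threshold are assumptions.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "lombardata/drone-DinoVdeau-from-probs-large-2024_11_15-batch-size32_freeze_probs"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("drone_ortho_tile.jpg").convert("RGB")  # hypothetical input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel output: score each class independently with a sigmoid.
probs = torch.sigmoid(logits)[0]
labels = [model.config.id2label[i] for i, p in enumerate(probs.tolist()) if p > 0.5]
print(labels)
```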

---

# Training and evaluation data

Details on the estimated number of images for each class are given in the following table:

| Class                    | train | test | val  | Total |
|:-------------------------|------:|-----:|-----:|------:|
| Acropore_branched        |  1220 |  363 |  362 |  1945 |
| Acropore_digitised       |   586 |  195 |  189 |   970 |
| Acropore_tabular         |   308 |  133 |  119 |   560 |
| Algae                    |  4777 | 1372 | 1384 |  7533 |
| Dead_coral               |  2513 |  671 |  693 |  3877 |
| Millepore                |   136 |   55 |   59 |   250 |
| No_acropore_encrusting   |   252 |   88 |   93 |   433 |
| No_acropore_massive      |  2158 |  725 |  726 |  3609 |
| No_acropore_sub_massive  |  2036 |  582 |  612 |  3230 |
| Rock                     |  5976 | 1941 | 1928 |  9845 |
| Rubble                   |  4851 | 1486 | 1474 |  7811 |
| Sand                     |  6155 | 2019 | 1990 | 10164 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 83
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (sketched after this list)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
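
A minimal sketch of the Adam + ReduceLROnPlateau combination listed above, using the patience and factor from this card; the stand-in model, the `mode="min"` choice, and the validation-loss plumbing are assumptions.

```python
# Hedged sketch of the listed optimizer/scheduler settings.
import torch

model = torch.nn.Linear(1024, 12)  # stand-in for the fine-tuned network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(83):
    val_loss = 1.0 / (epoch + 1)  # placeholder for the real validation loss
    scheduler.step(val_loss)  # cuts the LR by 10x after 5 epochs without improvement
```

This schedule matches the learning-rate trajectory in the results table below: 0.001 until epoch 19, 0.0001 from epoch 20, 1e-05 from epoch 59, and 1e-06 from epoch 80.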

## Data Augmentation

Data were augmented using the following transformations (a Kornia-style sketch follows the lists):

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
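
The transform names appear to match Kornia's augmentation ops (ColorJiggle is Kornia's color-jitter). A sketch of an equivalent train-time pipeline follows; the image size, jitter magnitudes, and normalization statistics are assumptions, not the recorded settings.

```python
# Hedged sketch of a train-time pipeline mirroring the list above.
import torch
import kornia.augmentation as K

train_transforms = torch.nn.Sequential(
    K.Resize((518, 518)),                       # assumed DINOv2-large input size
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),  # brightness/contrast/saturation/hue
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(4, 3, 600, 600)  # dummy batch of RGB images in [0, 1]
augmented = train_transforms(batch)
```

At validation time only Resize and Normalize are applied, matching the Val Transforms list.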

## Training results

| Epoch | Validation Loss | MAE | RMSE | KL div | Learning Rate |
|------:|----------------:|----:|-----:|-------:|--------------:|
| 1 | 0.4855 | 0.1364 | 0.1771 | 0.3101 | 0.001 |
| 2 | 0.4760 | 0.1247 | 0.1688 | 0.5077 | 0.001 |
| 3 | 0.4777 | 0.1230 | 0.1707 | 0.7896 | 0.001 |
| 4 | 0.4743 | 0.1238 | 0.1672 | 0.4932 | 0.001 |
| 5 | 0.4746 | 0.1277 | 0.1669 | 0.2901 | 0.001 |
| 6 | 0.4750 | 0.1253 | 0.1674 | 0.4399 | 0.001 |
| 7 | 0.4745 | 0.1259 | 0.1671 | 0.4868 | 0.001 |
| 8 | 0.4742 | 0.1257 | 0.1672 | 0.3241 | 0.001 |
| 9 | 0.4730 | 0.1236 | 0.1658 | 0.4560 | 0.001 |
| 10 | 0.4751 | 0.1269 | 0.1679 | 0.2141 | 0.001 |
| 11 | 0.4733 | 0.1265 | 0.1663 | 0.2530 | 0.001 |
| 12 | 0.4758 | 0.1264 | 0.1684 | 0.3966 | 0.001 |
| 13 | 0.4722 | 0.1223 | 0.1650 | 0.6055 | 0.001 |
| 14 | 0.4747 | 0.1250 | 0.1666 | 0.4203 | 0.001 |
| 15 | 0.4733 | 0.1227 | 0.1662 | 0.6553 | 0.001 |
| 16 | 0.4735 | 0.1241 | 0.1656 | 0.3576 | 0.001 |
| 17 | 0.4722 | 0.1221 | 0.1643 | 0.4545 | 0.001 |
| 18 | 0.4724 | 0.1225 | 0.1647 | 0.4902 | 0.001 |
| 19 | 0.4729 | 0.1261 | 0.1650 | 0.3158 | 0.001 |
| 20 | 0.4697 | 0.1203 | 0.1623 | 0.4574 | 0.0001 |
| 21 | 0.4689 | 0.1197 | 0.1613 | 0.4569 | 0.0001 |
| 22 | 0.4691 | 0.1202 | 0.1617 | 0.4535 | 0.0001 |
| 23 | 0.4691 | 0.1210 | 0.1614 | 0.2971 | 0.0001 |
| 24 | 0.4692 | 0.1196 | 0.1616 | 0.3916 | 0.0001 |
| 25 | 0.4677 | 0.1181 | 0.1601 | 0.4516 | 0.0001 |
| 26 | 0.4680 | 0.1171 | 0.1605 | 0.6089 | 0.0001 |
| 27 | 0.4675 | 0.1182 | 0.1600 | 0.4741 | 0.0001 |
| 28 | 0.4681 | 0.1200 | 0.1606 | 0.3356 | 0.0001 |
| 29 | 0.4678 | 0.1181 | 0.1603 | 0.4330 | 0.0001 |
| 30 | 0.4680 | 0.1194 | 0.1602 | 0.3160 | 0.0001 |
| 31 | 0.4677 | 0.1179 | 0.1600 | 0.4190 | 0.0001 |
| 32 | 0.4675 | 0.1188 | 0.1598 | 0.3706 | 0.0001 |
| 33 | 0.4671 | 0.1181 | 0.1593 | 0.3504 | 0.0001 |
| 34 | 0.4670 | 0.1180 | 0.1594 | 0.3881 | 0.0001 |
| 35 | 0.4663 | 0.1166 | 0.1587 | 0.4398 | 0.0001 |
| 36 | 0.4666 | 0.1170 | 0.1587 | 0.4382 | 0.0001 |
| 37 | 0.4658 | 0.1163 | 0.1581 | 0.4330 | 0.0001 |
| 38 | 0.4659 | 0.1162 | 0.1583 | 0.4878 | 0.0001 |
| 39 | 0.4670 | 0.1178 | 0.1595 | 0.3791 | 0.0001 |
| 40 | 0.4665 | 0.1178 | 0.1588 | 0.3889 | 0.0001 |
| 41 | 0.4666 | 0.1184 | 0.1589 | 0.3222 | 0.0001 |
| 42 | 0.4655 | 0.1164 | 0.1579 | 0.4262 | 0.0001 |
| 43 | 0.4656 | 0.1162 | 0.1579 | 0.4611 | 0.0001 |
| 44 | 0.4656 | 0.1164 | 0.1580 | 0.4586 | 0.0001 |
| 45 | 0.4660 | 0.1158 | 0.1583 | 0.4368 | 0.0001 |
| 46 | 0.4660 | 0.1164 | 0.1582 | 0.4118 | 0.0001 |
| 47 | 0.4652 | 0.1154 | 0.1577 | 0.5424 | 0.0001 |
| 48 | 0.4660 | 0.1160 | 0.1586 | 0.5251 | 0.0001 |
| 49 | 0.4660 | 0.1161 | 0.1585 | 0.5007 | 0.0001 |
| 50 | 0.4666 | 0.1185 | 0.1586 | 0.2424 | 0.0001 |
| 51 | 0.4661 | 0.1162 | 0.1584 | 0.4171 | 0.0001 |
| 52 | 0.4650 | 0.1155 | 0.1575 | 0.4912 | 0.0001 |
| 53 | 0.4654 | 0.1169 | 0.1578 | 0.4030 | 0.0001 |
| 54 | 0.4661 | 0.1153 | 0.1585 | 0.4811 | 0.0001 |
| 55 | 0.4653 | 0.1167 | 0.1576 | 0.3774 | 0.0001 |
| 56 | 0.4654 | 0.1176 | 0.1575 | 0.3254 | 0.0001 |
| 57 | 0.4654 | 0.1162 | 0.1575 | 0.3649 | 0.0001 |
| 58 | 0.4665 | 0.1166 | 0.1584 | 0.4075 | 0.0001 |
| 59 | 0.4652 | 0.1157 | 0.1575 | 0.4202 | 1e-05 |
| 60 | 0.4653 | 0.1157 | 0.1571 | 0.4084 | 1e-05 |
| 61 | 0.4654 | 0.1153 | 0.1573 | 0.4497 | 1e-05 |
| 62 | 0.4648 | 0.1153 | 0.1568 | 0.4112 | 1e-05 |
| 63 | 0.4648 | 0.1152 | 0.1567 | 0.3748 | 1e-05 |
| 64 | 0.4652 | 0.1162 | 0.1571 | 0.3044 | 1e-05 |
| 65 | 0.4648 | 0.1153 | 0.1569 | 0.4685 | 1e-05 |
| 66 | 0.4650 | 0.1148 | 0.1573 | 0.5087 | 1e-05 |
| 67 | 0.4646 | 0.1155 | 0.1568 | 0.4274 | 1e-05 |
| 68 | 0.4646 | 0.1144 | 0.1566 | 0.4969 | 1e-05 |
| 69 | 0.4644 | 0.1145 | 0.1564 | 0.4480 | 1e-05 |
| 70 | 0.4648 | 0.1150 | 0.1567 | 0.4291 | 1e-05 |
| 71 | 0.4645 | 0.1156 | 0.1565 | 0.3797 | 1e-05 |
| 72 | 0.4647 | 0.1150 | 0.1569 | 0.4280 | 1e-05 |
| 73 | 0.4641 | 0.1142 | 0.1563 | 0.4592 | 1e-05 |
| 74 | 0.4642 | 0.1151 | 0.1564 | 0.4321 | 1e-05 |
| 75 | 0.4645 | 0.1152 | 0.1565 | 0.3843 | 1e-05 |
| 76 | 0.4646 | 0.1147 | 0.1569 | 0.5216 | 1e-05 |
| 77 | 0.4648 | 0.1152 | 0.1569 | 0.4094 | 1e-05 |
| 78 | 0.4643 | 0.1149 | 0.1564 | 0.4399 | 1e-05 |
| 79 | 0.4646 | 0.1147 | 0.1567 | 0.4178 | 1e-05 |
| 80 | 0.4644 | 0.1150 | 0.1564 | 0.4373 | 1e-06 |
| 81 | 0.4645 | 0.1151 | 0.1567 | 0.4701 | 1e-06 |
| 82 | 0.4644 | 0.1146 | 0.1565 | 0.4601 | 1e-06 |
| 83 | 0.4646 | 0.1147 | 0.1567 | 0.4511 | 1e-06 |

---

# Framework Versions

- **Transformers**: 4.41.0
- **Pytorch**: 2.5.0+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.19.1