---
license: apache-2.0
base_model: facebook/dinov2-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dinov2-base-2024_09_09-batch-size32_epochs150_freeze
  results: []
---
# dinov2-base-2024_09_09-batch-size32_epochs150_freeze
This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1321
- F1 Micro: 0.8069
- F1 Macro: 0.7121
- Roc Auc: 0.8742
- Accuracy: 0.2869
- Learning Rate: 0.0000
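
The combination of micro/macro F1, ROC AUC, and a comparatively low accuracy suggests a multi-label classification head, where accuracy is exact-match over all labels. Under that assumption, a minimal inference sketch might look like the following; the checkpoint path and the 0.5 decision threshold are placeholders, not values stated in this card:

```python
# Minimal inference sketch, assuming a multi-label classifier fine-tuned from
# facebook/dinov2-base; the checkpoint path and 0.5 threshold are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "dinov2-base-2024_09_09-batch-size32_epochs150_freeze"  # hypothetical path
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding: independent sigmoid per class, keep labels over threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs.tolist()) if p > 0.5]
print(predicted)
```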
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
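
A sketch of how the settings above could map onto `transformers.TrainingArguments`; the output directory is a placeholder, the label count and `problem_type` are assumptions based on the metrics, and the backbone-freezing step is inferred from the `_freeze` suffix in the run name rather than stated in the card:

```python
# Hedged reconstruction of the listed hyperparameters; anything not listed in
# the card (output_dir, num_labels, problem_type, freezing) is an assumption.
from transformers import AutoModelForImageClassification, TrainingArguments

training_args = TrainingArguments(
    output_dir="dinov2-base-2024_09_09-batch-size32_epochs150_freeze",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed-precision training
    # The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 match the Trainer's
    # default optimizer settings, so nothing extra is needed here.
)

num_labels = 10  # placeholder; the card does not state the label count

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-base",
    num_labels=num_labels,
    problem_type="multi_label_classification",  # assumption, see metrics above
)

# The `_freeze` suffix in the run name suggests the DINOv2 backbone was kept
# frozen so only the classification head trains (an assumption, not confirmed).
for param in model.dinov2.parameters():
    param.requires_grad = False
```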
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
No log | 1.0 | 273 | 0.1601 | 0.7634 | 0.6251 | 0.8453 | 0.2328 | 0.001 |
0.1759 | 2.0 | 546 | 0.1504 | 0.7780 | 0.6462 | 0.8546 | 0.2498 | 0.001 |
0.1759 | 3.0 | 819 | 0.1483 | 0.7817 | 0.6644 | 0.8583 | 0.2564 | 0.001 |
0.1474 | 4.0 | 1092 | 0.1464 | 0.7863 | 0.6809 | 0.8634 | 0.2554 | 0.001 |
0.1474 | 5.0 | 1365 | 0.1423 | 0.7891 | 0.6919 | 0.8572 | 0.2682 | 0.001 |
0.1397 | 6.0 | 1638 | 0.1440 | 0.7902 | 0.6988 | 0.8629 | 0.2651 | 0.001 |
0.1397 | 7.0 | 1911 | 0.1425 | 0.7938 | 0.6850 | 0.8647 | 0.2682 | 0.001 |
0.1356 | 8.0 | 2184 | 0.1429 | 0.7931 | 0.6880 | 0.8700 | 0.2637 | 0.001 |
0.1356 | 9.0 | 2457 | 0.1463 | 0.7927 | 0.6885 | 0.8704 | 0.2557 | 0.001 |
0.1315 | 10.0 | 2730 | 0.1392 | 0.8009 | 0.7050 | 0.8729 | 0.2744 | 0.001 |
0.1308 | 11.0 | 3003 | 0.1443 | 0.7853 | 0.6892 | 0.8519 | 0.2699 | 0.001 |
0.1308 | 12.0 | 3276 | 0.1452 | 0.7888 | 0.6976 | 0.8670 | 0.2713 | 0.001 |
0.1277 | 13.0 | 3549 | 0.1370 | 0.8007 | 0.7032 | 0.8680 | 0.2765 | 0.001 |
0.1277 | 14.0 | 3822 | 0.1401 | 0.7984 | 0.6875 | 0.8694 | 0.2730 | 0.001 |
0.1257 | 15.0 | 4095 | 0.1379 | 0.8049 | 0.7001 | 0.8748 | 0.2817 | 0.001 |
0.1257 | 16.0 | 4368 | 0.1429 | 0.7969 | 0.7063 | 0.8675 | 0.2682 | 0.001 |
0.1257 | 17.0 | 4641 | 0.1451 | 0.7956 | 0.6861 | 0.8728 | 0.2613 | 0.001 |
0.1257 | 18.0 | 4914 | 0.1418 | 0.7906 | 0.6849 | 0.8574 | 0.2713 | 0.001 |
0.1251 | 19.0 | 5187 | 0.1438 | 0.7900 | 0.6794 | 0.8556 | 0.2654 | 0.001 |
0.1251 | 20.0 | 5460 | 0.1319 | 0.8068 | 0.7202 | 0.8705 | 0.2866 | 0.0001 |
0.1161 | 21.0 | 5733 | 0.1312 | 0.8081 | 0.7237 | 0.8715 | 0.2876 | 0.0001 |
0.1109 | 22.0 | 6006 | 0.1310 | 0.8101 | 0.7222 | 0.8788 | 0.2935 | 0.0001 |
0.1109 | 23.0 | 6279 | 0.1305 | 0.8120 | 0.7226 | 0.8776 | 0.2935 | 0.0001 |
0.1103 | 24.0 | 6552 | 0.1309 | 0.8096 | 0.7238 | 0.8769 | 0.2952 | 0.0001 |
0.1103 | 25.0 | 6825 | 0.1308 | 0.8093 | 0.7171 | 0.8735 | 0.2949 | 0.0001 |
0.1099 | 26.0 | 7098 | 0.1301 | 0.8100 | 0.7200 | 0.8745 | 0.2911 | 0.0001 |
0.1099 | 27.0 | 7371 | 0.1303 | 0.8082 | 0.7208 | 0.8727 | 0.2924 | 0.0001 |
0.1107 | 28.0 | 7644 | 0.1302 | 0.8103 | 0.7218 | 0.8752 | 0.2970 | 0.0001 |
0.1107 | 29.0 | 7917 | 0.1302 | 0.8104 | 0.7237 | 0.8766 | 0.2963 | 0.0001 |
0.1103 | 30.0 | 8190 | 0.1303 | 0.8097 | 0.7181 | 0.8745 | 0.2956 | 0.0001 |
0.1103 | 31.0 | 8463 | 0.1301 | 0.8092 | 0.7190 | 0.8739 | 0.2959 | 0.0001 |
0.1104 | 32.0 | 8736 | 0.1301 | 0.8098 | 0.7210 | 0.8740 | 0.2928 | 0.0001 |
0.1093 | 33.0 | 9009 | 0.1296 | 0.8100 | 0.7204 | 0.8738 | 0.2963 | 1e-05 |
0.1093 | 34.0 | 9282 | 0.1296 | 0.8101 | 0.7222 | 0.8745 | 0.2956 | 1e-05 |
0.1084 | 35.0 | 9555 | 0.1295 | 0.8109 | 0.7220 | 0.8758 | 0.2956 | 1e-05 |
0.1084 | 36.0 | 9828 | 0.1295 | 0.8105 | 0.7212 | 0.8746 | 0.2931 | 1e-05 |
0.1091 | 37.0 | 10101 | 0.1295 | 0.8119 | 0.7239 | 0.8757 | 0.2963 | 1e-05 |
0.1091 | 38.0 | 10374 | 0.1295 | 0.8104 | 0.7213 | 0.8744 | 0.2959 | 1e-05 |
0.1075 | 39.0 | 10647 | 0.1295 | 0.8106 | 0.7222 | 0.8752 | 0.2966 | 1e-05 |
0.1075 | 40.0 | 10920 | 0.1295 | 0.8113 | 0.7233 | 0.8768 | 0.2956 | 1e-05 |
0.1088 | 41.0 | 11193 | 0.1295 | 0.8100 | 0.7223 | 0.8739 | 0.2945 | 1e-05 |
0.1088 | 42.0 | 11466 | 0.1295 | 0.8111 | 0.7219 | 0.8750 | 0.2973 | 1e-05 |
0.1085 | 43.0 | 11739 | 0.1294 | 0.8098 | 0.7212 | 0.8738 | 0.2931 | 1e-05 |
0.1084 | 44.0 | 12012 | 0.1295 | 0.8108 | 0.7212 | 0.8746 | 0.2970 | 1e-05 |
0.1084 | 45.0 | 12285 | 0.1294 | 0.8104 | 0.7218 | 0.8749 | 0.2945 | 1e-05 |
0.1083 | 46.0 | 12558 | 0.1294 | 0.8113 | 0.7233 | 0.8759 | 0.2976 | 1e-05 |
0.1083 | 47.0 | 12831 | 0.1294 | 0.8107 | 0.7229 | 0.8753 | 0.2945 | 1e-05 |
0.109 | 48.0 | 13104 | 0.1294 | 0.8103 | 0.7209 | 0.8742 | 0.2956 | 1e-05 |
0.109 | 49.0 | 13377 | 0.1293 | 0.8111 | 0.7215 | 0.8755 | 0.2959 | 1e-05 |
0.108 | 50.0 | 13650 | 0.1294 | 0.8107 | 0.7211 | 0.8750 | 0.2966 | 1e-05 |
0.108 | 51.0 | 13923 | 0.1294 | 0.8099 | 0.7224 | 0.8742 | 0.2924 | 1e-05 |
0.1084 | 52.0 | 14196 | 0.1294 | 0.8110 | 0.7224 | 0.8755 | 0.2973 | 1e-05 |
0.1084 | 53.0 | 14469 | 0.1295 | 0.8111 | 0.7225 | 0.8757 | 0.2980 | 1e-05 |
0.1086 | 54.0 | 14742 | 0.1294 | 0.8105 | 0.7222 | 0.8752 | 0.2963 | 1e-05 |
0.1083 | 55.0 | 15015 | 0.1293 | 0.8107 | 0.7231 | 0.8754 | 0.2956 | 1e-05 |
0.1083 | 56.0 | 15288 | 0.1294 | 0.8107 | 0.7227 | 0.8753 | 0.2959 | 0.0000 |
0.108 | 57.0 | 15561 | 0.1293 | 0.8111 | 0.7231 | 0.8754 | 0.2956 | 0.0000 |
0.108 | 58.0 | 15834 | 0.1294 | 0.8112 | 0.7230 | 0.8755 | 0.2966 | 0.0000 |
0.1089 | 59.0 | 16107 | 0.1294 | 0.8110 | 0.7227 | 0.8753 | 0.2966 | 0.0000 |
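
The learning-rate column drops by a factor of ten around epochs 20, 33, and 56; the `0.0000` entries in the final rows (and in the summary above) are presumably non-zero values below the table's display precision. The gap between F1 micro (~0.81) and accuracy (~0.29) is consistent with multi-label evaluation, where accuracy is exact-match (subset) accuracy over all labels at once. A hedged sketch of how such metrics could be computed with scikit-learn (the 0.5 threshold is an assumption):

```python
# Sketch of a multi-label compute_metrics for the HF Trainer, using
# scikit-learn; the 0.5 decision threshold is an assumption.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = 1.0 / (1.0 + np.exp(-logits))   # per-class sigmoid
    preds = (probs > 0.5).astype(int)
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match over all labels
    }
```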
### Framework versions
- Transformers 4.41.1
- PyTorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1