# clip-roberta-finetuned
This model is a fine-tuned version of `./clip-roberta` on the davanstrien/manuscript_noisy_labels_iiif dataset. It achieves the following results on the evaluation set:
- Loss: 2.5792 (the best validation loss observed during training, reached at step 48,000, epoch ≈ 7; see the training results below)
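The base checkpoint name suggests a `VisionTextDualEncoderModel` pairing a CLIP vision encoder with a RoBERTa text encoder, as built by the Transformers `contrastive-image-text` example. Assuming that is the case, the sketch below shows minimal zero-shot image-text scoring; the checkpoint path, image file, and captions are placeholders, not part of this card.

```python
import torch
from PIL import Image
from transformers import VisionTextDualEncoderModel, VisionTextDualEncoderProcessor

# Hypothetical local path / Hub ID for this checkpoint.
ckpt = "clip-roberta-finetuned"
model = VisionTextDualEncoderModel.from_pretrained(ckpt)
processor = VisionTextDualEncoderProcessor.from_pretrained(ckpt)

image = Image.open("page.jpg")  # placeholder manuscript image
captions = ["a decorated initial", "a page of music notation", "a map"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    out = model(**inputs)

# logits_per_image holds image-text similarity scores, one per caption.
print(out.logits_per_image.softmax(dim=-1))
```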
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
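The card gives no further details, but the dataset named above is a Hugging Face Hub dataset and should be loadable directly; split names and columns are not documented here, so inspect the returned object.

```python
from datasets import load_dataset

# Dataset ID taken from the model card header.
ds = load_dataset("davanstrien/manuscript_noisy_labels_iiif")
print(ds)  # shows the available splits and column names
```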
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a rough PyTorch sketch of this setup follows the list):
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 256
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
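The run used the Hugging Face Trainer, which wires these settings up internally. As an illustration only, they map onto PyTorch/Transformers APIs roughly as below; `model` and `train_loader` are placeholders, and the optimizer reported as "Adam" was, at this Transformers version, the Trainer's default `AdamW`.

```python
import torch
from transformers import get_scheduler

def build_training_setup(model, train_loader, num_epochs=10):
    """Placeholder sketch recreating the optimizer/scheduler/AMP settings above."""
    torch.manual_seed(42)  # seed: 42
    # Reported as "Adam" with betas=(0.9, 0.999), epsilon=1e-08.
    optimizer = torch.optim.AdamW(
        model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8
    )
    lr_scheduler = get_scheduler(
        "linear",
        optimizer=optimizer,
        num_warmup_steps=0,  # no warmup is listed in the card
        num_training_steps=num_epochs * len(train_loader),
    )
    scaler = torch.cuda.amp.GradScaler()  # "Native AMP" mixed precision
    return optimizer, lr_scheduler, scaler
```

A training step would then wrap the forward pass in `torch.cuda.amp.autocast()` and scale the loss through `scaler` before stepping the optimizer and scheduler.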
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
2.9841 | 0.07 | 500 | 3.4112 |
2.72 | 0.15 | 1000 | 3.3430 |
2.6319 | 0.22 | 1500 | 3.2295 |
2.5781 | 0.29 | 2000 | 3.1645 |
2.5339 | 0.36 | 2500 | 3.1226 |
2.503 | 0.44 | 3000 | 3.0856 |
2.4581 | 0.51 | 3500 | 3.0639 |
2.4494 | 0.58 | 4000 | 3.0415 |
2.4275 | 0.65 | 4500 | 3.0245 |
2.3909 | 0.73 | 5000 | 2.9991 |
2.3902 | 0.8 | 5500 | 2.9931 |
2.3741 | 0.87 | 6000 | 2.9612 |
2.3536 | 0.95 | 6500 | 2.9509 |
2.3392 | 1.02 | 7000 | 2.9289 |
2.3083 | 1.09 | 7500 | 2.9214 |
2.3094 | 1.16 | 8000 | 2.9153 |
2.2864 | 1.24 | 8500 | 2.9034 |
2.2893 | 1.31 | 9000 | 2.8963 |
2.2697 | 1.38 | 9500 | 2.8847 |
2.2762 | 1.46 | 10000 | 2.8665 |
2.2667 | 1.53 | 10500 | 2.8536 |
2.2548 | 1.6 | 11000 | 2.8472 |
2.238 | 1.67 | 11500 | 2.8491 |
2.2423 | 1.75 | 12000 | 2.8257 |
2.2406 | 1.82 | 12500 | 2.8287 |
2.2248 | 1.89 | 13000 | 2.8193 |
2.223 | 1.96 | 13500 | 2.8101 |
2.1995 | 2.04 | 14000 | 2.8027 |
2.1834 | 2.11 | 14500 | 2.7880 |
2.1723 | 2.18 | 15000 | 2.7783 |
2.1651 | 2.26 | 15500 | 2.7739 |
2.1575 | 2.33 | 16000 | 2.7825 |
2.1598 | 2.4 | 16500 | 2.7660 |
2.1667 | 2.47 | 17000 | 2.7578 |
2.1565 | 2.55 | 17500 | 2.7580 |
2.1558 | 2.62 | 18000 | 2.7561 |
2.1642 | 2.69 | 18500 | 2.7512 |
2.1374 | 2.77 | 19000 | 2.7361 |
2.1402 | 2.84 | 19500 | 2.7385 |
2.1326 | 2.91 | 20000 | 2.7235 |
2.1272 | 2.98 | 20500 | 2.7183 |
2.0954 | 3.06 | 21000 | 2.7156 |
2.0842 | 3.13 | 21500 | 2.7065 |
2.0859 | 3.2 | 22000 | 2.7089 |
2.0856 | 3.27 | 22500 | 2.6962 |
2.0775 | 3.35 | 23000 | 2.6931 |
2.0821 | 3.42 | 23500 | 2.6933 |
2.0706 | 3.49 | 24000 | 2.7011 |
2.0689 | 3.57 | 24500 | 2.7009 |
2.0807 | 3.64 | 25000 | 2.6825 |
2.0639 | 3.71 | 25500 | 2.6744 |
2.0742 | 3.78 | 26000 | 2.6777 |
2.0789 | 3.86 | 26500 | 2.6689 |
2.0594 | 3.93 | 27000 | 2.6566 |
2.056 | 4.0 | 27500 | 2.6676 |
2.0223 | 4.08 | 28000 | 2.6711 |
2.0185 | 4.15 | 28500 | 2.6568 |
2.018 | 4.22 | 29000 | 2.6567 |
2.0036 | 4.29 | 29500 | 2.6545 |
2.0238 | 4.37 | 30000 | 2.6559 |
2.0091 | 4.44 | 30500 | 2.6450 |
2.0096 | 4.51 | 31000 | 2.6389 |
2.0083 | 4.58 | 31500 | 2.6401 |
2.0012 | 4.66 | 32000 | 2.6399 |
2.0166 | 4.73 | 32500 | 2.6289 |
1.9963 | 4.8 | 33000 | 2.6348 |
1.9943 | 4.88 | 33500 | 2.6240 |
2.0099 | 4.95 | 34000 | 2.6190 |
1.9895 | 5.02 | 34500 | 2.6308 |
1.9581 | 5.09 | 35000 | 2.6385 |
1.9502 | 5.17 | 35500 | 2.6237 |
1.9485 | 5.24 | 36000 | 2.6248 |
1.9643 | 5.31 | 36500 | 2.6279 |
1.9535 | 5.38 | 37000 | 2.6185 |
1.9575 | 5.46 | 37500 | 2.6146 |
1.9475 | 5.53 | 38000 | 2.6093 |
1.9434 | 5.6 | 38500 | 2.6090 |
1.954 | 5.68 | 39000 | 2.6027 |
1.9509 | 5.75 | 39500 | 2.6107 |
1.9454 | 5.82 | 40000 | 2.5980 |
1.9479 | 5.89 | 40500 | 2.6016 |
1.9539 | 5.97 | 41000 | 2.5971 |
1.9119 | 6.04 | 41500 | 2.6228 |
1.8974 | 6.11 | 42000 | 2.6169 |
1.9038 | 6.19 | 42500 | 2.6027 |
1.9008 | 6.26 | 43000 | 2.6027 |
1.9142 | 6.33 | 43500 | 2.6011 |
1.8783 | 6.4 | 44000 | 2.5960 |
1.8896 | 6.48 | 44500 | 2.6111 |
1.8975 | 6.55 | 45000 | 2.5889 |
1.9048 | 6.62 | 45500 | 2.6007 |
1.9049 | 6.69 | 46000 | 2.5972 |
1.8969 | 6.77 | 46500 | 2.6053 |
1.9105 | 6.84 | 47000 | 2.5893 |
1.8921 | 6.91 | 47500 | 2.5883 |
1.8918 | 6.99 | 48000 | 2.5792 |
1.8671 | 7.06 | 48500 | 2.6041 |
1.8551 | 7.13 | 49000 | 2.6070 |
1.8555 | 7.2 | 49500 | 2.6148 |
1.8543 | 7.28 | 50000 | 2.6077 |
1.8485 | 7.35 | 50500 | 2.6131 |
1.8474 | 7.42 | 51000 | 2.6039 |
1.8474 | 7.5 | 51500 | 2.5973 |
1.8442 | 7.57 | 52000 | 2.5946 |
1.8329 | 7.64 | 52500 | 2.6069 |
1.8551 | 7.71 | 53000 | 2.5923 |
1.8433 | 7.79 | 53500 | 2.5922 |
1.851 | 7.86 | 54000 | 2.5993 |
1.8313 | 7.93 | 54500 | 2.5960 |
1.8298 | 8.0 | 55000 | 2.6058 |
1.8159 | 8.08 | 55500 | 2.6286 |
1.817 | 8.15 | 56000 | 2.6348 |
1.8066 | 8.22 | 56500 | 2.6411 |
1.7935 | 8.3 | 57000 | 2.6338 |
1.809 | 8.37 | 57500 | 2.6290 |
1.812 | 8.44 | 58000 | 2.6258 |
1.79 | 8.51 | 58500 | 2.6321 |
1.8046 | 8.59 | 59000 | 2.6291 |
1.7975 | 8.66 | 59500 | 2.6283 |
1.7968 | 8.73 | 60000 | 2.6284 |
1.7779 | 8.81 | 60500 | 2.6257 |
1.7664 | 8.88 | 61000 | 2.6232 |
1.792 | 8.95 | 61500 | 2.6305 |
1.7725 | 9.02 | 62000 | 2.6525 |
1.7563 | 9.1 | 62500 | 2.6794 |
1.7606 | 9.17 | 63000 | 2.6784 |
1.7666 | 9.24 | 63500 | 2.6798 |
1.7551 | 9.31 | 64000 | 2.6813 |
1.7578 | 9.39 | 64500 | 2.6830 |
1.7483 | 9.46 | 65000 | 2.6833 |
1.7431 | 9.53 | 65500 | 2.6884 |
1.743 | 9.61 | 66000 | 2.6932 |
1.7395 | 9.68 | 66500 | 2.6927 |
1.7473 | 9.75 | 67000 | 2.6904 |
1.7413 | 9.82 | 67500 | 2.6892 |
1.7437 | 9.9 | 68000 | 2.6898 |
1.7546 | 9.97 | 68500 | 2.6894 |
### Framework versions
- Transformers 4.21.0.dev0
- Pytorch 1.12.0+cu102
- Datasets 2.3.2
- Tokenizers 0.12.1