# final_roberta_with_new_400k_plus_37k
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set (see the metrics sketch after this list):
- Loss: 0.2668
- Accuracy: 0.9031
- F1: 0.9027
- Precision: 0.9042
- Recall: 0.9031
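Recall matching accuracy to four decimal places suggests these are weighted averages across classes. Below is a minimal sketch of how such figures can be reproduced with scikit-learn; the weighted averaging scheme and the `labels`/`preds` arrays are assumptions, not confirmed by this card.

```python
# Minimal sketch, assuming weighted averaging; `labels` and `preds`
# are hypothetical arrays of gold and predicted class ids.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def classification_metrics(labels, preds):
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```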
## Model description
More information needed
## Intended uses & limitations
More information needed
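Although usage details are not documented, the reported metrics indicate a sequence classification head. The following is a hedged inference sketch using the Transformers Auto classes; the repository id and the label mapping are assumptions.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

# Hypothetical repository id; substitute the actual path to this checkpoint.
model_id = "final_roberta_with_new_400k_plus_37k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example input sentence."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
# Label names are not documented in this card; id2label may be generic.
print(model.config.id2label.get(predicted_class, predicted_class))
```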
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 3
- mixed_precision_training: Native AMP
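The list above maps onto the Transformers `Trainer` API roughly as sketched below. This is a reconstruction, not the original training script: `output_dir` and the 100-step evaluation cadence (inferred from the results table) are assumptions, and the Adam betas/epsilon match the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; output_dir and the
# evaluation cadence are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="final_roberta_with_new_400k_plus_37k",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=3,
    fp16=True,                       # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=100,
    logging_steps=100,
)
```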
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
0.3174 | 0.01 | 100 | 0.3254 | 0.8928 | 0.8919 | 0.8964 | 0.8928 |
0.3285 | 0.01 | 200 | 0.2578 | 0.8956 | 0.8951 | 0.8968 | 0.8956 |
0.247 | 0.02 | 300 | 0.3913 | 0.8621 | 0.8588 | 0.8783 | 0.8621 |
0.2853 | 0.03 | 400 | 0.3394 | 0.8737 | 0.8711 | 0.8871 | 0.8737 |
0.3031 | 0.04 | 500 | 0.3924 | 0.8537 | 0.8491 | 0.8770 | 0.8537 |
0.2747 | 0.04 | 600 | 0.2532 | 0.9079 | 0.9079 | 0.9080 | 0.9079 |
0.2797 | 0.05 | 700 | 0.3607 | 0.8637 | 0.8607 | 0.8781 | 0.8637 |
0.2211 | 0.06 | 800 | 0.2910 | 0.8880 | 0.8872 | 0.8909 | 0.8880 |
0.2769 | 0.06 | 900 | 0.2834 | 0.8824 | 0.8810 | 0.8884 | 0.8824 |
0.2412 | 0.07 | 1000 | 0.2394 | 0.9063 | 0.9061 | 0.9069 | 0.9063 |
0.3386 | 0.08 | 1100 | 0.2400 | 0.9016 | 0.9013 | 0.9020 | 0.9016 |
0.2743 | 0.09 | 1200 | 0.2421 | 0.9047 | 0.9048 | 0.9048 | 0.9047 |
0.2682 | 0.09 | 1300 | 0.2833 | 0.8768 | 0.8752 | 0.8839 | 0.8768 |
0.3219 | 0.1 | 1400 | 0.2383 | 0.9071 | 0.9070 | 0.9071 | 0.9071 |
0.2211 | 0.11 | 1500 | 0.2454 | 0.9047 | 0.9047 | 0.9047 | 0.9047 |
0.2606 | 0.11 | 1600 | 0.2083 | 0.9223 | 0.9221 | 0.9228 | 0.9223 |
0.1966 | 0.12 | 1700 | 0.2688 | 0.9004 | 0.9001 | 0.9007 | 0.9004 |
0.2205 | 0.13 | 1800 | 0.3076 | 0.8776 | 0.8752 | 0.8911 | 0.8776 |
0.2242 | 0.14 | 1900 | 0.2171 | 0.9151 | 0.9150 | 0.9153 | 0.9151 |
0.257 | 0.14 | 2000 | 0.2643 | 0.8912 | 0.8905 | 0.8932 | 0.8912 |
0.2238 | 0.15 | 2100 | 0.2165 | 0.9131 | 0.9128 | 0.9141 | 0.9131 |
0.2313 | 0.16 | 2200 | 0.2312 | 0.8996 | 0.8996 | 0.8996 | 0.8996 |
0.1856 | 0.16 | 2300 | 0.2269 | 0.9107 | 0.9108 | 0.9109 | 0.9107 |
0.2201 | 0.17 | 2400 | 0.2425 | 0.9059 | 0.9056 | 0.9065 | 0.9059 |
0.3332 | 0.18 | 2500 | 0.2254 | 0.9043 | 0.9044 | 0.9048 | 0.9043 |
0.1843 | 0.19 | 2600 | 0.2524 | 0.8980 | 0.8971 | 0.9020 | 0.8980 |
0.2728 | 0.19 | 2700 | 0.2348 | 0.8968 | 0.8957 | 0.9017 | 0.8968 |
0.2131 | 0.2 | 2800 | 0.2210 | 0.9135 | 0.9136 | 0.9138 | 0.9135 |
0.19 | 0.21 | 2900 | 0.2259 | 0.9123 | 0.9120 | 0.9130 | 0.9123 |
0.2099 | 0.21 | 3000 | 0.2814 | 0.9024 | 0.9016 | 0.9054 | 0.9024 |
0.2209 | 0.22 | 3100 | 0.2473 | 0.9051 | 0.9046 | 0.9070 | 0.9051 |
0.2366 | 0.23 | 3200 | 0.2561 | 0.8992 | 0.8983 | 0.9029 | 0.8992 |
0.3156 | 0.24 | 3300 | 0.2192 | 0.9095 | 0.9094 | 0.9095 | 0.9095 |
0.197 | 0.24 | 3400 | 0.2382 | 0.9063 | 0.9057 | 0.9093 | 0.9063 |
0.2371 | 0.25 | 3500 | 0.2243 | 0.9139 | 0.9141 | 0.9166 | 0.9139 |
0.2273 | 0.26 | 3600 | 0.2362 | 0.9135 | 0.9131 | 0.9153 | 0.9135 |
0.2504 | 0.26 | 3700 | 0.2671 | 0.8888 | 0.8873 | 0.8965 | 0.8888 |
0.1978 | 0.27 | 3800 | 0.2049 | 0.9171 | 0.9170 | 0.9172 | 0.9171 |
0.2189 | 0.28 | 3900 | 0.2268 | 0.9099 | 0.9099 | 0.9099 | 0.9099 |
0.2171 | 0.29 | 4000 | 0.2135 | 0.9163 | 0.9162 | 0.9164 | 0.9163 |
0.2325 | 0.29 | 4100 | 0.2624 | 0.8916 | 0.8905 | 0.8966 | 0.8916 |
0.1888 | 0.3 | 4200 | 0.2878 | 0.8924 | 0.8911 | 0.8987 | 0.8924 |
0.2345 | 0.31 | 4300 | 0.2444 | 0.8964 | 0.8953 | 0.9013 | 0.8964 |
0.1688 | 0.31 | 4400 | 0.2479 | 0.9083 | 0.9077 | 0.9109 | 0.9083 |
0.2083 | 0.32 | 4500 | 0.2200 | 0.9135 | 0.9131 | 0.9150 | 0.9135 |
0.2475 | 0.33 | 4600 | 0.2353 | 0.9035 | 0.9030 | 0.9052 | 0.9035 |
0.1928 | 0.34 | 4700 | 0.2987 | 0.8944 | 0.8933 | 0.8992 | 0.8944 |
0.2008 | 0.34 | 4800 | 0.2993 | 0.8760 | 0.8735 | 0.8897 | 0.8760 |
0.22 | 0.35 | 4900 | 0.2431 | 0.9035 | 0.9033 | 0.9039 | 0.9035 |
0.1844 | 0.36 | 5000 | 0.2590 | 0.9171 | 0.9171 | 0.9171 | 0.9171 |
0.2235 | 0.36 | 5100 | 0.2421 | 0.9047 | 0.9041 | 0.9072 | 0.9047 |
0.2222 | 0.37 | 5200 | 0.2958 | 0.8948 | 0.8941 | 0.8973 | 0.8948 |
0.2241 | 0.38 | 5300 | 0.2031 | 0.9211 | 0.9209 | 0.9216 | 0.9211 |
0.2307 | 0.39 | 5400 | 0.2277 | 0.9043 | 0.9036 | 0.9076 | 0.9043 |
0.1926 | 0.39 | 5500 | 0.2817 | 0.8900 | 0.8887 | 0.8959 | 0.8900 |
0.2119 | 0.4 | 5600 | 0.2151 | 0.9175 | 0.9174 | 0.9176 | 0.9175 |
0.1747 | 0.41 | 5700 | 0.2404 | 0.9123 | 0.9121 | 0.9126 | 0.9123 |
0.1809 | 0.41 | 5800 | 0.3013 | 0.8920 | 0.8908 | 0.8980 | 0.8920 |
0.1748 | 0.42 | 5900 | 0.3084 | 0.9063 | 0.9056 | 0.9097 | 0.9063 |
0.2101 | 0.43 | 6000 | 0.2129 | 0.9175 | 0.9173 | 0.9180 | 0.9175 |
0.202 | 0.44 | 6100 | 0.3794 | 0.8848 | 0.8834 | 0.8914 | 0.8848 |
0.1671 | 0.44 | 6200 | 0.2678 | 0.9043 | 0.9041 | 0.9046 | 0.9043 |
0.2808 | 0.45 | 6300 | 0.2613 | 0.9075 | 0.9070 | 0.9098 | 0.9075 |
0.2853 | 0.46 | 6400 | 0.2270 | 0.9087 | 0.9088 | 0.9088 | 0.9087 |
0.187 | 0.46 | 6500 | 0.2400 | 0.9111 | 0.9112 | 0.9115 | 0.9111 |
0.1382 | 0.47 | 6600 | 0.2454 | 0.9139 | 0.9136 | 0.9146 | 0.9139 |
0.2259 | 0.48 | 6700 | 0.3165 | 0.8904 | 0.8890 | 0.8976 | 0.8904 |
0.164 | 0.49 | 6800 | 0.3091 | 0.9031 | 0.9023 | 0.9074 | 0.9031 |
0.2557 | 0.49 | 6900 | 0.2708 | 0.9024 | 0.9015 | 0.9064 | 0.9024 |
0.1586 | 0.5 | 7000 | 0.2139 | 0.9247 | 0.9246 | 0.9247 | 0.9247 |
0.2391 | 0.51 | 7100 | 0.2087 | 0.9143 | 0.9141 | 0.9147 | 0.9143 |
0.1974 | 0.51 | 7200 | 0.2438 | 0.9171 | 0.9171 | 0.9171 | 0.9171 |
0.2507 | 0.52 | 7300 | 0.2323 | 0.9051 | 0.9044 | 0.9084 | 0.9051 |
0.226 | 0.53 | 7400 | 0.2465 | 0.9063 | 0.9056 | 0.9096 | 0.9063 |
0.1859 | 0.54 | 7500 | 0.2762 | 0.8960 | 0.8949 | 0.9012 | 0.8960 |
0.2208 | 0.54 | 7600 | 0.2705 | 0.8948 | 0.8937 | 0.9002 | 0.8948 |
0.2073 | 0.55 | 7700 | 0.2419 | 0.9008 | 0.9006 | 0.9007 | 0.9008 |
0.1557 | 0.56 | 7800 | 0.3004 | 0.8900 | 0.8899 | 0.8899 | 0.8900 |
0.1872 | 0.56 | 7900 | 0.2520 | 0.9059 | 0.9056 | 0.9066 | 0.9059 |
0.1749 | 0.57 | 8000 | 0.2757 | 0.9067 | 0.9067 | 0.9067 | 0.9067 |
0.2298 | 0.58 | 8100 | 0.2617 | 0.9075 | 0.9071 | 0.9092 | 0.9075 |
0.1781 | 0.59 | 8200 | 0.2380 | 0.9139 | 0.9137 | 0.9144 | 0.9139 |
0.1448 | 0.59 | 8300 | 0.2668 | 0.9031 | 0.9027 | 0.9042 | 0.9031 |
### Framework versions
- Transformers 4.39.3
- PyTorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1