# Neuria_BERT_Contexto
This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0980
- Accuracy: 0.8660
- Precision Micro: 0.9703
- Recall Micro: 0.8790
- F1 Micro: 0.9224
- F1 Macro: 0.7374
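Micro-averaged scores pool all label decisions, while the macro F1 weights every label equally, so the gap between F1 Micro (0.9224) and F1 Macro (0.7374) suggests uneven per-label performance. The accuracy also differs from the micro F1, which is consistent with a multi-label setup (for single-label classification the two coincide). The evaluation code is not published with this card; the following is a minimal sketch, assuming scikit-learn and binary indicator arrays, of how such micro/macro scores are typically computed. The example arrays are illustrative only.

```python
# Hedged sketch: micro/macro precision, recall and F1 for multi-label predictions.
# The label set, array shapes and values below are illustrative, not this model's data.
import numpy as np
from sklearn.metrics import precision_score, recall_score, f1_score

# y_true / y_pred: binary indicator matrices of shape (n_samples, n_labels)
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 0]])

print("precision_micro:", precision_score(y_true, y_pred, average="micro", zero_division=0))
print("recall_micro:   ", recall_score(y_true, y_pred, average="micro", zero_division=0))
print("f1_micro:       ", f1_score(y_true, y_pred, average="micro", zero_division=0))
print("f1_macro:       ", f1_score(y_true, y_pred, average="macro", zero_division=0))
```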
## Model description
More information needed
## Intended uses & limitations
More information needed
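No usage details are documented. Based on the reported micro/macro metrics, the checkpoint appears to be a multi-label sequence classifier for Spanish text; this is an assumption, as are the decision threshold of 0.5 and the example input in the sketch below.

```python
# Hedged sketch: loading the checkpoint as a multi-label classifier.
# The problem type, threshold, and label names are assumptions; the card
# does not document the task or label set.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "neuria99/Neuria_BERT_Contexto"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Texto de ejemplo en español."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid + 0.5 threshold is a common multi-label decision rule (assumption).
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```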
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
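The training script is not published. A minimal sketch of how these hyperparameters might map onto `transformers.TrainingArguments` is shown below; the `output_dir` is a placeholder, and the dataset, model head, and metric wiring are omitted (assumptions throughout).

```python
# Hedged sketch: expressing the listed hyperparameters with the Trainer API.
# Only the values reported in this card are filled in; everything else is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Neuria_BERT_Contexto",   # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,       # effective train batch size: 64 * 4 = 256
    num_train_epochs=50,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```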
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Micro | Recall Micro | F1 Micro | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:------------:|:--------:|:--------:|
0.5538 | 1.0 | 6 | 0.6032 | 0.0685 | 0.1679 | 0.1183 | 0.1388 | 0.0580 |
0.506 | 2.0 | 12 | 0.5156 | 0.0031 | 0.2 | 0.0027 | 0.0053 | 0.0041 |
0.4227 | 3.0 | 18 | 0.4170 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.3503 | 4.0 | 24 | 0.3627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.3087 | 5.0 | 30 | 0.3348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.2892 | 6.0 | 36 | 0.3220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.277 | 7.0 | 42 | 0.3104 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.2664 | 8.0 | 48 | 0.2957 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.2525 | 9.0 | 54 | 0.2762 | 0.0872 | 1.0 | 0.1183 | 0.2115 | 0.0760 |
0.2383 | 10.0 | 60 | 0.2540 | 0.1900 | 0.9878 | 0.2177 | 0.3568 | 0.1589 |
0.2161 | 11.0 | 66 | 0.2363 | 0.2991 | 0.9034 | 0.3522 | 0.5068 | 0.2426 |
0.1976 | 12.0 | 72 | 0.2156 | 0.4299 | 0.9087 | 0.5081 | 0.6517 | 0.3581 |
0.1849 | 13.0 | 78 | 0.2039 | 0.5047 | 0.9030 | 0.5753 | 0.7028 | 0.4136 |
0.1686 | 14.0 | 84 | 0.1877 | 0.5732 | 0.8996 | 0.6505 | 0.7551 | 0.4510 |
0.1616 | 15.0 | 90 | 0.1746 | 0.6386 | 0.9446 | 0.6882 | 0.7963 | 0.4854 |
0.145 | 16.0 | 96 | 0.1651 | 0.6916 | 0.9477 | 0.7312 | 0.8255 | 0.5074 |
0.1403 | 17.0 | 102 | 0.1554 | 0.7259 | 0.9589 | 0.7527 | 0.8434 | 0.5398 |
0.1272 | 18.0 | 108 | 0.1457 | 0.7664 | 0.9670 | 0.7876 | 0.8681 | 0.5675 |
0.1188 | 19.0 | 114 | 0.1412 | 0.7477 | 0.9659 | 0.7608 | 0.8511 | 0.5605 |
0.1128 | 20.0 | 120 | 0.1325 | 0.7819 | 0.9612 | 0.7984 | 0.8722 | 0.6468 |
0.1065 | 21.0 | 126 | 0.1280 | 0.7975 | 0.9773 | 0.8118 | 0.8869 | 0.6576 |
0.101 | 22.0 | 132 | 0.1227 | 0.8037 | 0.9712 | 0.8172 | 0.8876 | 0.6556 |
0.0987 | 23.0 | 138 | 0.1201 | 0.8193 | 0.9780 | 0.8360 | 0.9014 | 0.7055 |
0.0948 | 24.0 | 144 | 0.1159 | 0.8193 | 0.975 | 0.8387 | 0.9017 | 0.7017 |
0.0905 | 25.0 | 150 | 0.1113 | 0.8411 | 0.9755 | 0.8548 | 0.9112 | 0.7149 |
0.0875 | 26.0 | 156 | 0.1106 | 0.8349 | 0.9753 | 0.8495 | 0.9080 | 0.7147 |
0.0869 | 27.0 | 162 | 0.1066 | 0.8380 | 0.9726 | 0.8575 | 0.9114 | 0.7221 |
0.0833 | 28.0 | 168 | 0.1058 | 0.8505 | 0.9787 | 0.8656 | 0.9187 | 0.7295 |
0.0827 | 29.0 | 174 | 0.1045 | 0.8536 | 0.9758 | 0.8683 | 0.9189 | 0.7332 |
0.0817 | 30.0 | 180 | 0.1030 | 0.8536 | 0.9729 | 0.8683 | 0.9176 | 0.7319 |
0.0806 | 31.0 | 186 | 0.1011 | 0.8598 | 0.9701 | 0.8737 | 0.9194 | 0.7351 |
0.0784 | 32.0 | 192 | 0.1013 | 0.8598 | 0.9731 | 0.8737 | 0.9207 | 0.7350 |
0.0778 | 33.0 | 198 | 0.1008 | 0.8598 | 0.9731 | 0.8737 | 0.9207 | 0.7357 |
0.0771 | 34.0 | 204 | 0.0991 | 0.8629 | 0.9731 | 0.8763 | 0.9222 | 0.7373 |
0.0763 | 35.0 | 210 | 0.0987 | 0.8598 | 0.9701 | 0.8737 | 0.9194 | 0.7351 |
0.0795 | 36.0 | 216 | 0.0985 | 0.8629 | 0.9731 | 0.8763 | 0.9222 | 0.7373 |
0.078 | 37.0 | 222 | 0.0986 | 0.8629 | 0.9731 | 0.8763 | 0.9222 | 0.7373 |
0.0761 | 38.0 | 228 | 0.0983 | 0.8629 | 0.9702 | 0.8763 | 0.9209 | 0.7366 |
0.0754 | 39.0 | 234 | 0.0982 | 0.8629 | 0.9702 | 0.8763 | 0.9209 | 0.7366 |
0.0759 | 40.0 | 240 | 0.0981 | 0.8660 | 0.9703 | 0.8790 | 0.9224 | 0.7374 |
0.0758 | 41.0 | 246 | 0.0981 | 0.8660 | 0.9703 | 0.8790 | 0.9224 | 0.7374 |
0.0877 | 41.7619 | 250 | 0.0980 | 0.8660 | 0.9703 | 0.8790 | 0.9224 | 0.7374 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.4.1
- Datasets 2.19.1
- Tokenizers 0.21.0