
bert-30M-uncased-classification-CMC-fqa-new

This model is a fine-tuned version of vietgpt/bert-30M-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7760
  • Accuracy: 0.9677
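
The snippet below is a minimal usage sketch for running the classifier from the Hub repo thangvip/bert-30M-uncased-classification-CMC-fqa-new; the example question is a hypothetical placeholder, and the returned label names depend on the (undocumented) label mapping stored in the model config.

```python
# Minimal inference sketch; the input text is a hypothetical placeholder.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="thangvip/bert-30M-uncased-classification-CMC-fqa-new",
)

result = classifier("How do I reset my account password?")
print(result)  # e.g. [{"label": "...", "score": ...}] per the model's label map
```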

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
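
The hyperparameters above map directly onto the Hugging Face `TrainingArguments`. The sketch below shows how such a run could be wired up; it is not the original training script. The toy dataset, label set, and `num_labels` are hypothetical placeholders, since the actual training data is not documented.

```python
# Reproduction sketch using the listed hyperparameters.
# The toy dataset and num_labels=2 are hypothetical placeholders.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "vietgpt/bert-30M-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# Placeholder data standing in for the undocumented FQA classification set.
raw = Dataset.from_dict({
    "text": ["placeholder question one", "placeholder question two"],
    "label": [0, 1],
})
encoded = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64),
    batched=True,
)

args = TrainingArguments(
    output_dir="bert-30M-uncased-classification-CMC-fqa-new",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,
    eval_dataset=encoded,  # placeholder: reuse the toy set for evaluation
    tokenizer=tokenizer,
)
trainer.train()
```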

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 20   | 3.4306          | 0.0323   |
| No log        | 2.0   | 40   | 3.4143          | 0.0323   |
| No log        | 3.0   | 60   | 3.4026          | 0.0645   |
| No log        | 4.0   | 80   | 3.3888          | 0.2258   |
| No log        | 5.0   | 100  | 3.3725          | 0.2581   |
| No log        | 6.0   | 120  | 3.3523          | 0.3548   |
| No log        | 7.0   | 140  | 3.3244          | 0.4194   |
| No log        | 8.0   | 160  | 3.2797          | 0.3871   |
| No log        | 9.0   | 180  | 3.2072          | 0.5161   |
| No log        | 10.0  | 200  | 3.0977          | 0.4839   |
| No log        | 11.0  | 220  | 2.9538          | 0.2903   |
| No log        | 12.0  | 240  | 2.8136          | 0.2903   |
| No log        | 13.0  | 260  | 2.6977          | 0.3871   |
| No log        | 14.0  | 280  | 2.5970          | 0.4839   |
| No log        | 15.0  | 300  | 2.5041          | 0.5806   |
| No log        | 16.0  | 320  | 2.4092          | 0.5484   |
| No log        | 17.0  | 340  | 2.3064          | 0.6774   |
| No log        | 18.0  | 360  | 2.2057          | 0.6774   |
| No log        | 19.0  | 380  | 2.0945          | 0.7419   |
| No log        | 20.0  | 400  | 1.9827          | 0.7742   |
| No log        | 21.0  | 420  | 1.8641          | 0.7742   |
| No log        | 22.0  | 440  | 1.7476          | 0.7742   |
| No log        | 23.0  | 460  | 1.6518          | 0.8065   |
| No log        | 24.0  | 480  | 1.5613          | 0.8065   |
| 2.7559        | 25.0  | 500  | 1.4894          | 0.8387   |
| 2.7559        | 26.0  | 520  | 1.4089          | 0.8387   |
| 2.7559        | 27.0  | 540  | 1.3390          | 0.8065   |
| 2.7559        | 28.0  | 560  | 1.2802          | 0.8710   |
| 2.7559        | 29.0  | 580  | 1.2265          | 0.8710   |
| 2.7559        | 30.0  | 600  | 1.1639          | 0.8387   |
| 2.7559        | 31.0  | 620  | 1.1253          | 0.8710   |
| 2.7559        | 32.0  | 640  | 1.0845          | 0.9032   |
| 2.7559        | 33.0  | 660  | 1.0468          | 0.9032   |
| 2.7559        | 34.0  | 680  | 1.0144          | 0.9032   |
| 2.7559        | 35.0  | 700  | 0.9805          | 0.9355   |
| 2.7559        | 36.0  | 720  | 0.9564          | 0.9355   |
| 2.7559        | 37.0  | 740  | 0.9237          | 0.9677   |
| 2.7559        | 38.0  | 760  | 0.9041          | 0.9355   |
| 2.7559        | 39.0  | 780  | 0.8815          | 0.9677   |
| 2.7559        | 40.0  | 800  | 0.8668          | 0.9677   |
| 2.7559        | 41.0  | 820  | 0.8486          | 0.9677   |
| 2.7559        | 42.0  | 840  | 0.8288          | 0.9677   |
| 2.7559        | 43.0  | 860  | 0.8174          | 0.9677   |
| 2.7559        | 44.0  | 880  | 0.8058          | 0.9677   |
| 2.7559        | 45.0  | 900  | 0.7978          | 0.9677   |
| 2.7559        | 46.0  | 920  | 0.7901          | 0.9677   |
| 2.7559        | 47.0  | 940  | 0.7842          | 0.9677   |
| 2.7559        | 48.0  | 960  | 0.7798          | 0.9677   |
| 2.7559        | 49.0  | 980  | 0.7769          | 0.9677   |
| 1.1031        | 50.0  | 1000 | 0.7760          | 0.9677   |

Framework versions

  • Transformers 4.37.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1
