---
license: apache-2.0
tags:
- generated_from_trainer
base_model: distilbert/distilbert-base-cased
metrics:
- accuracy
- precision
- recall
model-index:
- name: case-analysis-distilbert-base-cased
  results: []
---

## Metrics

- loss: 1.8402
- accuracy: 0.8085
- precision: 0.7983
- recall: 0.8085
- precision_macro: 0.6608
- recall_macro: 0.6429
- macro_fpr: 0.0935
- weighted_fpr: 0.0732
- weighted_specificity: 0.8548
- macro_specificity: 0.9158
- weighted_sensitivity: 0.8085
- macro_sensitivity: 0.6429
- f1_micro: 0.8085
- f1_macro: 0.6478
- f1_weighted: 0.8018
- runtime: 131.6318
- samples_per_second: 3.4110
- steps_per_second: 0.4330

# case-analysis-distilbert-base-cased

This model is a fine-tuned version of [distilbert/distilbert-base-cased](https://huggingface.co/distilbert/distilbert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8402
- Accuracy: 0.8085
- Precision: 0.7983
- Recall: 0.8085
- Precision Macro: 0.6461
- Recall Macro: 0.6218
- Macro Fpr: 0.0984
- Weighted Fpr: 0.0771
- Weighted Specificity: 0.8479
- Macro Specificity: 0.9119
- Weighted Sensitivity: 0.7996
- Macro Sensitivity: 0.6218
- F1 Micro: 0.7996
- F1 Macro: 0.6245
- F1 Weighted: 0.7887

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
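The original training script is not included in this repository; the following is a minimal sketch of `TrainingArguments` that reproduces the hyperparameters listed above. The `output_dir` value is an assumption, and the Adam betas/epsilon are left at the `transformers` defaults, which already match the listed values.

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the listed hyperparameters, not the original script.
training_args = TrainingArguments(
    output_dir="case-analysis-distilbert-base-cased",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # linear decay schedule
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```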
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 224  | 0.7001          | 0.7661   | 0.7311    | 0.7661 | 0.5791          | 0.5137       | 0.1330    | 0.0923       | 0.7614               | 0.8819            | 0.7661               | 0.5137            | 0.7661   | 0.5270   | 0.7333      |
| No log        | 2.0   | 448  | 0.7388          | 0.7751   | 0.7315    | 0.7751 | 0.5585          | 0.5464       | 0.1208    | 0.0882       | 0.7908               | 0.8915            | 0.7751               | 0.5464            | 0.7751   | 0.5487   | 0.7493      |
| 0.7066        | 3.0   | 672  | 0.7229          | 0.8018   | 0.7605    | 0.8018 | 0.5932          | 0.5708       | 0.1076    | 0.0761       | 0.8090               | 0.9027            | 0.8018               | 0.5708            | 0.8018   | 0.5767   | 0.7760      |
| 0.7066        | 4.0   | 896  | 0.8331          | 0.8062   | 0.7896    | 0.8062 | 0.6675          | 0.6115       | 0.1018    | 0.0742       | 0.8218               | 0.9070            | 0.8062               | 0.6115            | 0.8062   | 0.6301   | 0.7934      |
| 0.3654        | 5.0   | 1120 | 1.2300          | 0.7684   | 0.7699    | 0.7684 | 0.6085          | 0.6131       | 0.1066    | 0.0913       | 0.8542               | 0.9056            | 0.7684               | 0.6131            | 0.7684   | 0.5896   | 0.7611      |
| 0.3654        | 6.0   | 1344 | 1.0698          | 0.8129   | 0.7940    | 0.8129 | 0.6864          | 0.6153       | 0.0957    | 0.0712       | 0.8406               | 0.9134            | 0.8129               | 0.6153            | 0.8129   | 0.6300   | 0.7972      |
| 0.2047        | 7.0   | 1568 | 1.3300          | 0.7884   | 0.7960    | 0.7884 | 0.6412          | 0.5959       | 0.1044    | 0.0821       | 0.8421               | 0.9076            | 0.7884               | 0.5959            | 0.7884   | 0.6141   | 0.7892      |
| 0.2047        | 8.0   | 1792 | 1.3870          | 0.8107   | 0.7861    | 0.8107 | 0.6467          | 0.6063       | 0.0983    | 0.0722       | 0.8318               | 0.9106            | 0.8107               | 0.6063            | 0.8107   | 0.6163   | 0.7947      |
| 0.0795        | 9.0   | 2016 | 1.5031          | 0.7951   | 0.7719    | 0.7951 | 0.6275          | 0.5969       | 0.1040    | 0.0791       | 0.8320               | 0.9068            | 0.7951               | 0.5969            | 0.7951   | 0.6036   | 0.7803      |
| 0.0795        | 10.0  | 2240 | 1.6304          | 0.7728   | 0.7796    | 0.7728 | 0.6171          | 0.6233       | 0.1060    | 0.0892       | 0.8561               | 0.9072            | 0.7728               | 0.6233            | 0.7728   | 0.6196   | 0.7759      |
| 0.0795        | 11.0  | 2464 | 1.6553          | 0.8040   | 0.7802    | 0.8040 | 0.6405          | 0.6047       | 0.1003    | 0.0751       | 0.8333               | 0.9093            | 0.8040               | 0.6047            | 0.8040   | 0.6097   | 0.7884      |
| 0.0309        | 12.0  | 2688 | 1.6668          | 0.7996   | 0.7776    | 0.7996 | 0.6247          | 0.6084       | 0.0999    | 0.0771       | 0.8431               | 0.9107            | 0.7996               | 0.6084            | 0.7996   | 0.6073   | 0.7861      |
| 0.0309        | 13.0  | 2912 | 1.7548          | 0.8040   | 0.7724    | 0.8040 | 0.6059          | 0.5847       | 0.1030    | 0.0751       | 0.8216               | 0.9064            | 0.8040               | 0.5847            | 0.8040   | 0.5912   | 0.7846      |
| 0.0225        | 14.0  | 3136 | 1.6691          | 0.8107   | 0.7736    | 0.8107 | 0.5965          | 0.6044       | 0.0974    | 0.0722       | 0.8336               | 0.9111            | 0.8107               | 0.6044            | 0.8107   | 0.5998   | 0.7909      |
| 0.0225        | 15.0  | 3360 | 1.8751          | 0.8040   | 0.7897    | 0.8040 | 0.6516          | 0.6081       | 0.1007    | 0.0751       | 0.8322               | 0.9091            | 0.8040               | 0.6081            | 0.8040   | 0.6251   | 0.7939      |
| 0.0048        | 16.0  | 3584 | 1.8402          | 0.8085   | 0.7983    | 0.8085 | 0.6608          | 0.6429       | 0.0935    | 0.0732       | 0.8548               | 0.9158            | 0.8085               | 0.6429            | 0.8085   | 0.6478   | 0.8018      |
| 0.0048        | 17.0  | 3808 | 1.9124          | 0.7951   | 0.7871    | 0.7951 | 0.6331          | 0.6237       | 0.1001    | 0.0791       | 0.8456               | 0.9102            | 0.7951               | 0.6237            | 0.7951   | 0.6250   | 0.7891      |
| 0.0069        | 18.0  | 4032 | 1.8857          | 0.7973   | 0.7794    | 0.7973 | 0.6268          | 0.5972       | 0.1048    | 0.0781       | 0.8240               | 0.9053            | 0.7973               | 0.5972            | 0.7973   | 0.6062   | 0.7847      |
| 0.0069        | 19.0  | 4256 | 1.9492          | 0.8062   | 0.7813    | 0.8062 | 0.6467          | 0.6015       | 0.1006    | 0.0742       | 0.8281               | 0.9086            | 0.8062               | 0.6015            | 0.8062   | 0.6107   | 0.7895      |
| 0.0069        | 20.0  | 4480 | 1.8994          | 0.8085   | 0.7849    | 0.8085 | 0.6417          | 0.6067       | 0.0988    | 0.0732       | 0.8322               | 0.9102            | 0.8085               | 0.6067            | 0.8085   | 0.6144   | 0.7932      |
| 0.0034        | 21.0  | 4704 | 1.9819          | 0.8040   | 0.7898    | 0.8040 | 0.6748          | 0.6325       | 0.0976    | 0.0751       | 0.8439               | 0.9120            | 0.8040               | 0.6325            | 0.8040   | 0.6429   | 0.7942      |
| 0.0034        | 22.0  | 4928 | 2.0181          | 0.8062   | 0.7880    | 0.8062 | 0.6736          | 0.6204       | 0.0977    | 0.0742       | 0.8408               | 0.9118            | 0.8062               | 0.6204            | 0.8062   | 0.6293   | 0.7930      |
| 0.0001        | 23.0  | 5152 | 2.0305          | 0.8062   | 0.7880    | 0.8062 | 0.6736          | 0.6204       | 0.0977    | 0.0742       | 0.8408               | 0.9118            | 0.8062               | 0.6204            | 0.8062   | 0.6293   | 0.7930      |
| 0.0001        | 24.0  | 5376 | 2.0249          | 0.8040   | 0.7801    | 0.8040 | 0.6448          | 0.6004       | 0.1019    | 0.0751       | 0.8256               | 0.9074            | 0.8040               | 0.6004            | 0.8040   | 0.6092   | 0.7877      |
| 0.0           | 25.0  | 5600 | 2.0139          | 0.8018   | 0.7848    | 0.8018 | 0.6514          | 0.6226       | 0.0984    | 0.0761       | 0.8438               | 0.9114            | 0.8018               | 0.6226            | 0.8018   | 0.6272   | 0.7908      |
| 0.0           | 26.0  | 5824 | 2.0075          | 0.8040   | 0.7868    | 0.8040 | 0.6586          | 0.6281       | 0.0961    | 0.0751       | 0.8487               | 0.9132            | 0.8040               | 0.6281            | 0.8040   | 0.6305   | 0.7926      |
| 0.0026        | 27.0  | 6048 | 2.0155          | 0.8040   | 0.7868    | 0.8040 | 0.6586          | 0.6281       | 0.0961    | 0.0751       | 0.8487               | 0.9132            | 0.8040               | 0.6281            | 0.8040   | 0.6305   | 0.7926      |
| 0.0026        | 28.0  | 6272 | 2.0191          | 0.8040   | 0.7865    | 0.8040 | 0.6586          | 0.6237       | 0.0970    | 0.0751       | 0.8463               | 0.9126            | 0.8040               | 0.6237            | 0.8040   | 0.6283   | 0.7923      |
| 0.0026        | 29.0  | 6496 | 2.0225          | 0.8040   | 0.7865    | 0.8040 | 0.6586          | 0.6237       | 0.0970    | 0.0751       | 0.8463               | 0.9126            | 0.8040               | 0.6237            | 0.8040   | 0.6283   | 0.7923      |
| 0.0           | 30.0  | 6720 | 2.0343          | 0.7996   | 0.7821    | 0.7996 | 0.6461          | 0.6218       | 0.0984    | 0.0771       | 0.8479               | 0.9119            | 0.7996               | 0.6218            | 0.7996   | 0.6245   | 0.7887      |
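The table reports per-class aggregates (macro/weighted FPR, specificity, sensitivity) that standard `evaluate` metrics do not emit directly, and the evaluation code itself is not included in this repository. Below is a minimal sketch of one common way to derive them from a confusion matrix, assuming one-vs-rest definitions (specificity = TN/(TN+FP), FPR = FP/(FP+TN)) with "macro" as the unweighted class mean and "weighted" as the support-weighted mean; the exact formulas used for this card may differ.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def specificity_fpr_report(y_true, y_pred):
    """Per-class specificity/FPR aggregates under assumed one-vs-rest definitions."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp             # predicted as class c, actually another class
    fn = cm.sum(axis=1) - tp             # actually class c, predicted as another class
    tn = cm.sum() - (tp + fp + fn)
    specificity = tn / (tn + fp)         # per-class true-negative rate
    fpr = fp / (fp + tn)                 # per-class false-positive rate
    weights = cm.sum(axis=1) / cm.sum()  # class support shares
    return {
        "macro_specificity": float(specificity.mean()),
        "weighted_specificity": float(specificity @ weights),
        "macro_fpr": float(fpr.mean()),
        "weighted_fpr": float(fpr @ weights),
    }
```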
### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
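## Usage example

Since the intended-use section above is still to be filled in, here is a hedged inference sketch for a DistilBERT sequence-classification checkpoint like this one. The repository id is a placeholder for wherever the checkpoint is hosted, and the input text is purely illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "<user>/case-analysis-distilbert-base-cased"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Illustrative input; replace with the case text you want to classify.
inputs = tokenizer("Example case text to classify.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```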