---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: DIALOGUE_one_
  results: []
---

# DIALOGUE_one_

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2117
- Precision: 0.9762
- Recall: 0.9737
- F1: 0.9736
- Accuracy: 0.9737

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
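For reference, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as in the minimal sketch below. This is not the original training script: the output directory name is an assumption, and the 30-step evaluation/logging interval is inferred from the step column of the training-results table that follows.

```python
from transformers import TrainingArguments

# Rough sketch of the configuration listed above (not the original script).
# output_dir is an assumption; evaluating/logging every 30 steps is inferred
# from the step column of the training-results table below.
training_args = TrainingArguments(
    output_dir="DIALOGUE_one_",
    learning_rate=3e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="steps",
    eval_steps=30,
    logging_steps=30,
)
```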
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.9949        | 0.62  | 30   | 0.4697          | 0.9659    | 0.9605 | 0.9603 | 0.9605   |
| 0.3831        | 1.25  | 60   | 0.1338          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.1135        | 1.88  | 90   | 0.1407          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0256        | 2.5   | 120  | 0.1359          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0126        | 3.12  | 150  | 0.1449          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0227        | 3.75  | 180  | 0.1552          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0051        | 4.38  | 210  | 0.1573          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0037        | 5.0   | 240  | 0.1594          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.003         | 5.62  | 270  | 0.1626          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0024        | 6.25  | 300  | 0.1645          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0021        | 6.88  | 330  | 0.1737          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0018        | 7.5   | 360  | 0.1759          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0015        | 8.12  | 390  | 0.1774          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0014        | 8.75  | 420  | 0.1801          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0013        | 9.38  | 450  | 0.1837          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0011        | 10.0  | 480  | 0.1852          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.001         | 10.62 | 510  | 0.1878          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0009        | 11.25 | 540  | 0.1892          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0009        | 11.88 | 570  | 0.1939          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0008        | 12.5  | 600  | 0.1948          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0008        | 13.12 | 630  | 0.1961          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0007        | 13.75 | 660  | 0.1965          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0007        | 14.38 | 690  | 0.1972          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0006        | 15.0  | 720  | 0.1988          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0006        | 15.62 | 750  | 0.1993          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0005        | 16.25 | 780  | 0.2006          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0005        | 16.88 | 810  | 0.2020          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0005        | 17.5  | 840  | 0.2031          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0005        | 18.12 | 870  | 0.2045          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0005        | 18.75 | 900  | 0.2054          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 19.38 | 930  | 0.2051          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 20.0  | 960  | 0.2053          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 20.62 | 990  | 0.2058          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 21.25 | 1020 | 0.2069          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 21.88 | 1050 | 0.2076          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 22.5  | 1080 | 0.2079          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 23.12 | 1110 | 0.2084          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 23.75 | 1140 | 0.2092          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 24.38 | 1170 | 0.2095          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 25.0  | 1200 | 0.2100          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 25.62 | 1230 | 0.2104          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 26.25 | 1260 | 0.2109          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 26.88 | 1290 | 0.2111          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 27.5  | 1320 | 0.2113          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 28.12 | 1350 | 0.2115          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 28.75 | 1380 | 0.2116          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0003        | 29.38 | 1410 | 0.2117          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |
| 0.0004        | 30.0  | 1440 | 0.2117          | 0.9762    | 0.9737 | 0.9736 | 0.9737   |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
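The card does not state the downstream task, but the classification-style metrics above suggest a classification head. Assuming a sequence-classification head and that the checkpoint is published under the model name above, inference would look roughly like the sketch below; the repo id may need an account prefix, and the example sentence is arbitrary.

```python
from transformers import pipeline

# Assumes a sequence-classification head; if the model was actually trained
# for token classification, use the "token-classification" task instead.
# "DIALOGUE_one_" is the model name from this card and may need a namespace
# prefix such as "your-username/DIALOGUE_one_".
classifier = pipeline("text-classification", model="DIALOGUE_one_")
print(classifier("Could you book a table for two at seven?"))
```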