---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: QA_using_DistilBERT_LORA_qv
  results: []
---

# QA_using_DistilBERT_LORA_qv

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7782

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

A hedged sketch of a LoRA training setup matching these hyperparameters is given after the framework versions below.

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 4.7211 | 0.01 | 500 | 4.4152 |
| 4.3014 | 0.03 | 1000 | 4.3057 |
| 4.2235 | 0.04 | 1500 | 4.2254 |
| 4.1424 | 0.06 | 2000 | 4.1592 |
| 4.1312 | 0.07 | 2500 | 4.1091 |
| 4.033 | 0.09 | 3000 | 3.8720 |
| 3.739 | 0.1 | 3500 | 3.7028 |
| 3.6547 | 0.12 | 4000 | 3.5784 |
| 3.4915 | 0.13 | 4500 | 3.4967 |
| 3.5266 | 0.15 | 5000 | 3.4501 |
| 3.4602 | 0.16 | 5500 | 3.5048 |
| 3.4749 | 0.18 | 6000 | 3.3635 |
| 3.4088 | 0.19 | 6500 | 3.3465 |
| 3.3869 | 0.21 | 7000 | 3.3438 |
| 3.3835 | 0.22 | 7500 | 3.2838 |
| 3.2902 | 0.23 | 8000 | 3.3156 |
| 3.2747 | 0.25 | 8500 | 3.2770 |
| 3.2968 | 0.26 | 9000 | 3.2578 |
| 3.2305 | 0.28 | 9500 | 3.2645 |
| 3.2288 | 0.29 | 10000 | 3.1857 |
| 3.2717 | 0.31 | 10500 | 3.2326 |
| 3.1697 | 0.32 | 11000 | 3.2098 |
| 3.1786 | 0.34 | 11500 | 3.2656 |
| 3.2063 | 0.35 | 12000 | 3.1725 |
| 3.186 | 0.37 | 12500 | 3.1901 |
| 3.1389 | 0.38 | 13000 | 3.1706 |
| 3.234 | 0.4 | 13500 | 3.1553 |
| 3.1207 | 0.41 | 14000 | 3.1764 |
| 3.1764 | 0.42 | 14500 | 3.1441 |
| 3.1458 | 0.44 | 15000 | 3.1459 |
| 3.0631 | 0.45 | 15500 | 3.1461 |
| 3.1193 | 0.47 | 16000 | 3.1306 |
| 3.0437 | 0.48 | 16500 | 3.1775 |
| 3.1309 | 0.5 | 17000 | 3.0853 |
| 3.0448 | 0.51 | 17500 | 3.1136 |
| 3.0273 | 0.53 | 18000 | 3.0640 |
| 3.0826 | 0.54 | 18500 | 3.0786 |
| 3.0044 | 0.56 | 19000 | 3.0843 |
| 3.0672 | 0.57 | 19500 | 3.0516 |
| 3.0447 | 0.59 | 20000 | 3.0581 |
| 3.0168 | 0.6 | 20500 | 3.0369 |
| 2.9619 | 0.62 | 21000 | 3.0725 |
| 3.0981 | 0.63 | 21500 | 3.0389 |
| 3.0247 | 0.64 | 22000 | 3.0339 |
| 3.041 | 0.66 | 22500 | 3.0465 |
| 3.0286 | 0.67 | 23000 | 3.0806 |
| 3.0136 | 0.69 | 23500 | 3.0149 |
| 2.9814 | 0.7 | 24000 | 3.0128 |
| 3.0359 | 0.72 | 24500 | 3.0086 |
| 2.9939 | 0.73 | 25000 | 3.0216 |
| 2.996 | 0.75 | 25500 | 3.1415 |
| 2.9554 | 0.76 | 26000 | 3.0490 |
| 2.9773 | 0.78 | 26500 | 3.0457 |
| 2.9625 | 0.79 | 27000 | 2.9663 |
| 2.9184 | 0.81 | 27500 | 2.9981 |
| 2.9735 | 0.82 | 28000 | 3.0404 |
| 2.9567 | 0.84 | 28500 | 2.9621 |
| 2.9706 | 0.85 | 29000 | 3.0024 |
| 2.9436 | 0.86 | 29500 | 2.9535 |
| 2.9069 | 0.88 | 30000 | 2.9993 |
| 2.9652 | 0.89 | 30500 | 2.9393 |
| 2.9426 | 0.91 | 31000 | 2.9693 |
| 2.8936 | 0.92 | 31500 | 2.9111 |
| 2.9245 | 0.94 | 32000 | 2.9678 |
| 2.9054 | 0.95 | 32500 | 2.9263 |
| 2.8426 | 0.97 | 33000 | 2.9429 |
| 2.8782 | 0.98 | 33500 | 2.9232 |
| 2.8963 | 1.0 | 34000 | 2.9545 |
| 2.8757 | 1.01 | 34500 | 2.9181 |
| 2.853 | 1.03 | 35000 | 2.8925 |
| 2.8758 | 1.04 | 35500 | 2.9464 |
| 2.9179 | 1.06 | 36000 | 2.9076 |
| 2.8924 | 1.07 | 36500 | 2.8874 |
| 2.9488 | 1.08 | 37000 | 2.9284 |
| 2.8746 | 1.1 | 37500 | 2.9012 |
| 2.8026 | 1.11 | 38000 | 2.8679 |
| 2.8177 | 1.13 | 38500 | 2.9000 |
| 2.8113 | 1.14 | 39000 | 2.9069 |
| 2.8047 | 1.16 | 39500 | 2.8755 |
| 2.8437 | 1.17 | 40000 | 2.9043 |
| 2.8093 | 1.19 | 40500 | 2.8915 |
| 2.7881 | 1.2 | 41000 | 2.8665 |
| 2.8251 | 1.22 | 41500 | 2.8516 |
| 2.8356 | 1.23 | 42000 | 2.8927 |
| 2.7805 | 1.25 | 42500 | 2.8759 |
| 2.8944 | 1.26 | 43000 | 2.8491 |
| 2.88 | 1.27 | 43500 | 2.8458 |
| 2.8109 | 1.29 | 44000 | 2.8613 |
| 2.7595 | 1.3 | 44500 | 2.8734 |
| 2.8038 | 1.32 | 45000 | 2.8344 |
| 2.8113 | 1.33 | 45500 | 2.8448 |
| 2.8396 | 1.35 | 46000 | 2.8216 |
| 2.833 | 1.36 | 46500 | 2.8445 |
| 2.7711 | 1.38 | 47000 | 2.8499 |
| 2.7933 | 1.39 | 47500 | 2.8649 |
| 2.8079 | 1.41 | 48000 | 2.8390 |
| 2.781 | 1.42 | 48500 | 2.7999 |
| 2.8195 | 1.44 | 49000 | 2.8320 |
| 2.7553 | 1.45 | 49500 | 2.8500 |
| 2.7769 | 1.47 | 50000 | 2.8364 |
| 2.6745 | 1.48 | 50500 | 2.8392 |
| 2.7891 | 1.49 | 51000 | 2.8166 |
| 2.7691 | 1.51 | 51500 | 2.8195 |
| 2.7744 | 1.52 | 52000 | 2.8505 |
| 2.739 | 1.54 | 52500 | 2.8055 |
| 2.7843 | 1.55 | 53000 | 2.8633 |
| 2.7072 | 1.57 | 53500 | 2.8214 |
| 2.7658 | 1.58 | 54000 | 2.8178 |
| 2.7271 | 1.6 | 54500 | 2.8075 |
| 2.8387 | 1.61 | 55000 | 2.8025 |
| 2.7425 | 1.63 | 55500 | 2.8061 |
| 2.7464 | 1.64 | 56000 | 2.7882 |
| 2.7442 | 1.66 | 56500 | 2.8161 |
| 2.7398 | 1.67 | 57000 | 2.8091 |
| 2.7081 | 1.69 | 57500 | 2.8166 |
| 2.759 | 1.7 | 58000 | 2.8014 |
| 2.6873 | 1.71 | 58500 | 2.7949 |
| 2.8057 | 1.73 | 59000 | 2.8044 |
| 2.8156 | 1.74 | 59500 | 2.7860 |
| 2.6884 | 1.76 | 60000 | 2.7931 |
| 2.7627 | 1.77 | 60500 | 2.7931 |
| 2.6991 | 1.79 | 61000 | 2.7895 |
| 2.8059 | 1.8 | 61500 | 2.7981 |
| 2.7018 | 1.82 | 62000 | 2.7972 |
| 2.7027 | 1.83 | 62500 | 2.7956 |
| 2.7658 | 1.85 | 63000 | 2.7949 |
| 2.7735 | 1.86 | 63500 | 2.7803 |
| 2.6972 | 1.88 | 64000 | 2.7894 |
| 2.6512 | 1.89 | 64500 | 2.8087 |
| 2.6856 | 1.9 | 65000 | 2.7795 |
| 2.7292 | 1.92 | 65500 | 2.7772 |
| 2.7744 | 1.93 | 66000 | 2.7821 |
| 2.8022 | 1.95 | 66500 | 2.7858 |
| 2.7054 | 1.96 | 67000 | 2.7816 |
| 2.7255 | 1.98 | 67500 | 2.7740 |
| 2.6243 | 1.99 | 68000 | 2.7782 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
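
## Training setup (sketch)

The model name suggests LoRA adapters on DistilBERT's query and value projections, which are named `q_lin` and `v_lin` in the `transformers` implementation. The card does not record the LoRA rank, alpha, dropout, or the training dataset, so those values below are illustrative assumptions; only the `TrainingArguments` mirror the hyperparameters listed above.

```python
# A minimal sketch of a training setup consistent with this card, assuming
# LoRA on DistilBERT's query/value projections ("q_lin"/"v_lin") with an
# extractive QA head. The LoRA rank/alpha/dropout and the dataset are NOT
# recorded on this card and are placeholders.
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
base_model = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")

lora_config = LoraConfig(
    task_type=TaskType.QUESTION_ANS,    # extractive QA task head
    target_modules=["q_lin", "v_lin"],  # DistilBERT's query/value projections
    r=8,                                # assumption: rank not recorded on the card
    lora_alpha=16,                      # assumption
    lora_dropout=0.1,                   # assumption
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()

# These values mirror the "Training hyperparameters" section above.
training_args = TrainingArguments(
    output_dir="QA_using_DistilBERT_LORA_qv",
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=64,
    num_train_epochs=2,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",
    eval_steps=500,  # matches the 500-step eval cadence in the results table
)

# The training/eval datasets are not recorded on this card, so the Trainer
# call is left as a stub:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```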
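
## Example inference (sketch)

Assuming this repo hosts a PEFT LoRA adapter for extractive QA on top of `distilbert-base-uncased`, it could be loaded roughly as follows. The repo id `username/QA_using_DistilBERT_LORA_qv` is a placeholder; substitute the actual repo id.

```python
# A minimal inference sketch, assuming the adapter loads with PEFT on top of
# the distilbert-base-uncased QA architecture.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
base_model = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")
# Placeholder repo id; replace with the actual adapter location.
model = PeftModel.from_pretrained(base_model, "username/QA_using_DistilBERT_LORA_qv")
model.eval()

question = "What is the capital of France?"
context = "Paris is the capital and most populous city of France."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions and decode the answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)  # expected: "paris" (the model is uncased)
```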