gc394 committed
Commit 6d9a273 · verified · 1 Parent(s): f7e8df4

End of training

Files changed (1): README.md (+11, -8)
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.6983
+- Loss: 1.5870
 
 ## Model description
 
@@ -40,23 +40,26 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 5
+- num_epochs: 8
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| No log        | 1.0   | 130  | 1.9611          |
-| No log        | 2.0   | 260  | 1.8001          |
-| No log        | 3.0   | 390  | 1.7215          |
-| 1.9905        | 4.0   | 520  | 1.7342          |
-| 1.9905        | 5.0   | 650  | 1.6983          |
+| No log        | 1.0   | 130  | 1.9587          |
+| No log        | 2.0   | 260  | 1.7889          |
+| No log        | 3.0   | 390  | 1.7015          |
+| 1.9762        | 4.0   | 520  | 1.6974          |
+| 1.9762        | 5.0   | 650  | 1.6412          |
+| 1.9762        | 6.0   | 780  | 1.6209          |
+| 1.9762        | 7.0   | 910  | 1.6294          |
+| 1.6723        | 8.0   | 1040 | 1.5870          |
 
 
 ### Framework versions
 
-- Transformers 4.40.0
+- Transformers 4.40.1
 - Pytorch 2.2.1+cu121
 - Datasets 2.19.0
 - Tokenizers 0.19.1
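The schedule implied by the updated hyperparameters can be sketched in plain Python. This is a hypothetical illustration, not code from the commit: it assumes 130 optimizer steps per epoch (the results table shows epoch 1.0 at step 130) and the standard linear decay with zero warmup, as in Transformers' `get_linear_schedule_with_warmup`.

```python
# Hypothetical sketch of the linear LR schedule implied by the card:
# num_epochs: 8, lr_scheduler_type: linear, 130 steps per epoch
# (inferred from the results table; not taken from the commit itself).

NUM_EPOCHS = 8
STEPS_PER_EPOCH = 130                  # results table: epoch 1.0 -> step 130
TOTAL_STEPS = NUM_EPOCHS * STEPS_PER_EPOCH

def linear_lr_factor(step: int, total_steps: int = TOTAL_STEPS) -> float:
    """Multiplier on the base learning rate at `step` under a linear
    scheduler with no warmup: decays from 1.0 at step 0 to 0.0 at the end."""
    return max(0.0, (total_steps - step) / total_steps)

print(TOTAL_STEPS)                      # 1040, matching the final table row
print(linear_lr_factor(0))              # 1.0 at the start of training
print(round(linear_lr_factor(520), 2))  # 0.5 halfway through (end of epoch 4)
```

The eight-epoch run thus ends at step 1040, which is exactly where the table reports the best validation loss (1.5870).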