---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: roberta-base-sst-2-16-13-smoothed
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-sst-2-16-13-smoothed

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5750
- Accuracy: 0.9688

## Model description

More information needed

## Intended uses & limitations

More information needed

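A minimal usage sketch with 🤗 Transformers is shown below. The repository id and the `LABEL_0`/`LABEL_1` output names are assumptions inferred from the model name and the default config, not details confirmed by this card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Hypothetical repository id inferred from the model name; adjust to the actual path.
model_id = "simonycl/roberta-base-sst-2-16-13-smoothed"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Binary (SST-2-style) sentiment classification.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("A touching and beautifully acted film."))
# e.g. [{'label': 'LABEL_1', 'score': ...}] -- label names depend on the exported config
```
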
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 75
- label_smoothing_factor: 0.45

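These hyperparameters map directly onto 🤗 `TrainingArguments`; a minimal sketch is shown below (model, dataset, and `Trainer` wiring omitted, and `output_dir` is a placeholder).

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="roberta-base-sst-2-16-13-smoothed",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=75,
    label_smoothing_factor=0.45,
)
```
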
50
+ ### Training results
51
+
52
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
53
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|
54
+ | No log | 1.0 | 1 | 0.6919 | 0.5 |
55
+ | No log | 2.0 | 2 | 0.6919 | 0.5 |
56
+ | No log | 3.0 | 3 | 0.6919 | 0.5 |
57
+ | No log | 4.0 | 4 | 0.6919 | 0.5 |
58
+ | No log | 5.0 | 5 | 0.6919 | 0.5 |
59
+ | No log | 6.0 | 6 | 0.6918 | 0.5 |
60
+ | No log | 7.0 | 7 | 0.6918 | 0.5 |
61
+ | No log | 8.0 | 8 | 0.6918 | 0.5 |
62
+ | No log | 9.0 | 9 | 0.6918 | 0.5 |
63
+ | 0.6949 | 10.0 | 10 | 0.6917 | 0.5 |
64
+ | 0.6949 | 11.0 | 11 | 0.6917 | 0.5 |
65
+ | 0.6949 | 12.0 | 12 | 0.6916 | 0.5 |
66
+ | 0.6949 | 13.0 | 13 | 0.6916 | 0.5 |
67
+ | 0.6949 | 14.0 | 14 | 0.6915 | 0.5 |
68
+ | 0.6949 | 15.0 | 15 | 0.6914 | 0.5 |
69
+ | 0.6949 | 16.0 | 16 | 0.6914 | 0.5312 |
70
+ | 0.6949 | 17.0 | 17 | 0.6913 | 0.5312 |
71
+ | 0.6949 | 18.0 | 18 | 0.6912 | 0.5312 |
72
+ | 0.6949 | 19.0 | 19 | 0.6911 | 0.625 |
73
+ | 0.6926 | 20.0 | 20 | 0.6910 | 0.625 |
74
+ | 0.6926 | 21.0 | 21 | 0.6909 | 0.6562 |
75
+ | 0.6926 | 22.0 | 22 | 0.6907 | 0.6875 |
76
+ | 0.6926 | 23.0 | 23 | 0.6906 | 0.6875 |
77
+ | 0.6926 | 24.0 | 24 | 0.6904 | 0.6875 |
78
+ | 0.6926 | 25.0 | 25 | 0.6902 | 0.75 |
79
+ | 0.6926 | 26.0 | 26 | 0.6899 | 0.75 |
80
+ | 0.6926 | 27.0 | 27 | 0.6896 | 0.75 |
81
+ | 0.6926 | 28.0 | 28 | 0.6893 | 0.7188 |
82
+ | 0.6926 | 29.0 | 29 | 0.6890 | 0.6875 |
83
+ | 0.687 | 30.0 | 30 | 0.6885 | 0.6875 |
84
+ | 0.687 | 31.0 | 31 | 0.6880 | 0.7188 |
85
+ | 0.687 | 32.0 | 32 | 0.6874 | 0.7188 |
86
+ | 0.687 | 33.0 | 33 | 0.6866 | 0.7188 |
87
+ | 0.687 | 34.0 | 34 | 0.6857 | 0.7188 |
88
+ | 0.687 | 35.0 | 35 | 0.6846 | 0.75 |
89
+ | 0.687 | 36.0 | 36 | 0.6832 | 0.75 |
90
+ | 0.687 | 37.0 | 37 | 0.6814 | 0.7812 |
91
+ | 0.687 | 38.0 | 38 | 0.6791 | 0.7812 |
92
+ | 0.687 | 39.0 | 39 | 0.6761 | 0.875 |
93
+ | 0.6732 | 40.0 | 40 | 0.6721 | 0.9062 |
94
+ | 0.6732 | 41.0 | 41 | 0.6670 | 0.9062 |
95
+ | 0.6732 | 42.0 | 42 | 0.6601 | 0.9062 |
96
+ | 0.6732 | 43.0 | 43 | 0.6510 | 0.875 |
97
+ | 0.6732 | 44.0 | 44 | 0.6392 | 0.875 |
98
+ | 0.6732 | 45.0 | 45 | 0.6248 | 0.875 |
99
+ | 0.6732 | 46.0 | 46 | 0.6098 | 0.875 |
100
+ | 0.6732 | 47.0 | 47 | 0.5961 | 0.875 |
101
+ | 0.6732 | 48.0 | 48 | 0.5884 | 0.9375 |
102
+ | 0.6732 | 49.0 | 49 | 0.5833 | 0.9375 |
103
+ | 0.5913 | 50.0 | 50 | 0.5795 | 0.9062 |
104
+ | 0.5913 | 51.0 | 51 | 0.5851 | 0.9062 |
105
+ | 0.5913 | 52.0 | 52 | 0.5985 | 0.875 |
106
+ | 0.5913 | 53.0 | 53 | 0.6110 | 0.8125 |
107
+ | 0.5913 | 54.0 | 54 | 0.6092 | 0.8438 |
108
+ | 0.5913 | 55.0 | 55 | 0.6007 | 0.8438 |
109
+ | 0.5913 | 56.0 | 56 | 0.5904 | 0.875 |
110
+ | 0.5913 | 57.0 | 57 | 0.5846 | 0.9062 |
111
+ | 0.5913 | 58.0 | 58 | 0.5829 | 0.9062 |
112
+ | 0.5913 | 59.0 | 59 | 0.5843 | 0.9062 |
113
+ | 0.544 | 60.0 | 60 | 0.5900 | 0.8438 |
114
+ | 0.544 | 61.0 | 61 | 0.5970 | 0.8438 |
115
+ | 0.544 | 62.0 | 62 | 0.6026 | 0.8438 |
116
+ | 0.544 | 63.0 | 63 | 0.6030 | 0.8438 |
117
+ | 0.544 | 64.0 | 64 | 0.5980 | 0.8438 |
118
+ | 0.544 | 65.0 | 65 | 0.5901 | 0.8438 |
119
+ | 0.544 | 66.0 | 66 | 0.5843 | 0.875 |
120
+ | 0.544 | 67.0 | 67 | 0.5800 | 0.9062 |
121
+ | 0.544 | 68.0 | 68 | 0.5779 | 0.9375 |
122
+ | 0.544 | 69.0 | 69 | 0.5765 | 0.9375 |
123
+ | 0.5383 | 70.0 | 70 | 0.5758 | 0.9688 |
124
+ | 0.5383 | 71.0 | 71 | 0.5754 | 0.9688 |
125
+ | 0.5383 | 72.0 | 72 | 0.5752 | 0.9688 |
126
+ | 0.5383 | 73.0 | 73 | 0.5751 | 0.9688 |
127
+ | 0.5383 | 74.0 | 74 | 0.5750 | 0.9688 |
128
+ | 0.5383 | 75.0 | 75 | 0.5750 | 0.9688 |
129
+
130
+
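Note that the validation loss plateaus near 0.575 even as accuracy reaches 0.9688; with `label_smoothing_factor: 0.45` the smoothed cross-entropy cannot approach zero. A rough sketch of that floor, assuming standard Trainer-style label smoothing over 2 classes (an assumption, not something stated in this card):

```python
import math

# Assumed Trainer-style label smoothing with epsilon = 0.45 over 2 classes:
#   loss(p) = -(1 - eps/2) * log(p_true) - (eps/2) * log(1 - p_true)
eps = 0.45
p_opt = 1 - eps / 2  # true-class probability that minimises the smoothed loss
floor = -(1 - eps / 2) * math.log(p_opt) - (eps / 2) * math.log(1 - p_opt)
print(f"approximate loss floor: {floor:.3f}")  # ~0.53, in line with the ~0.575 plateau
```
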
### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3