Commit daa9fda
1 Parent(s): 806c6fb

initial commit

Files changed:
- README.md +18 -0
- config.json +26 -0
- pytorch_model.bin +3 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- training.log +69 -0
- vocab.txt +0 -0
README.md ADDED
@@ -0,0 +1,18 @@
---
language: en
tags:
- bert
- cola
- glue
- kd
- torchdistill
license: apache-2.0
datasets:
- cola
metrics:
- matthew's correlation
---

`bert-base-uncased` fine-tuned on the CoLA dataset, using a fine-tuned `bert-large-uncased` as the teacher model, with [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_kd_and_submission.ipynb) for knowledge distillation.
The training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/cola/kd/bert_base_uncased_from_bert_large_uncased.yaml).
I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **78.9**.
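For quick reference, a minimal inference sketch built on the files added in this commit; `path/to/this/repo` is a placeholder for a local clone (or the Hub repo id), and the label convention assumed is the usual GLUE CoLA one (1 = acceptable).

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_path = "path/to/this/repo"  # placeholder: local clone of this repo or its Hub id
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

# CoLA is binary acceptability classification; index 1 is assumed to mean "acceptable".
sentence = "The book was written by the author."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
print("acceptable" if logits.argmax(dim=-1).item() == 1 else "unacceptable")
```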
config.json ADDED
@@ -0,0 +1,26 @@
{
  "_name_or_path": "bert-base-uncased",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": "cola",
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "single_label_classification",
  "transformers_version": "4.6.1",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
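As a sketch of how this configuration is consumed (assuming `config.json` from this commit sits in the working directory): the config alone fixes the architecture, while the fine-tuned weights live in `pytorch_model.bin`.

```python
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig.from_json_file("config.json")  # the file added above
model = BertForSequenceClassification(config)      # randomly initialized skeleton
# Real weights come from pytorch_model.bin (e.g. via from_pretrained on the repo path).
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)  # 12 768 12
```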
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e45ce2e84137cdf2360767e72f37d2224f4bb63c9225b534bfc9505981affeca
size 438024457
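`pytorch_model.bin` is tracked with Git LFS, so the commit stores only the pointer above. A small sketch for checking a resolved download against the pointer's `oid` and `size` (the filename is assumed to be the resolved weights file, not the pointer):

```python
import hashlib
import os

def sha256_of(path, chunk_size=1 << 20):
    # Stream the ~438 MB checkpoint so it is never fully loaded into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

path = "pytorch_model.bin"  # resolved file, not the LFS pointer
assert os.path.getsize(path) == 438024457
assert sha256_of(path) == "e45ce2e84137cdf2360767e72f37d2224f4bb63c9225b534bfc9505981affeca"
```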
special_tokens_map.json ADDED
@@ -0,0 +1 @@
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render.
tokenizer_config.json ADDED
@@ -0,0 +1 @@
{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
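A short sketch of what this tokenizer configuration means in practice (placeholder path again): text is lowercased, wrapped in [CLS]/[SEP], and truncated to at most 512 tokens.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/this/repo")  # placeholder path
enc = tokenizer("Green ideas sleep furiously.", truncation=True, max_length=512)
tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"])
print(tokens[0], tokens[-1])  # [CLS] [SEP]
print("green" in tokens)      # True: the capitalized "Green" was lowercased
```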
training.log ADDED
@@ -0,0 +1,69 @@
2021-05-31 00:20:49,022 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/cola/kd/bert_base_uncased_from_bert_large_uncased.yaml', log='log/glue/cola/kd/bert_base_uncased_from_bert_large_uncased.txt', private_output='leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/', seed=None, student_only=False, task_name='cola', test_only=False, world_size=1)
2021-05-31 00:20:49,055 INFO __main__ Distributed environment: NO
Num processes: 1
Process index: 0
Local process index: 0
Device: cuda
Use FP16 precision: True
2021-05-31 00:20:57,823 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/cola/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
2021-05-31 00:20:58,582 INFO __main__ Start training
2021-05-31 00:20:58,582 INFO torchdistill.models.util [teacher model]
2021-05-31 00:20:58,582 INFO torchdistill.models.util Using the original teacher model
2021-05-31 00:20:58,582 INFO torchdistill.models.util [student model]
2021-05-31 00:20:58,582 INFO torchdistill.models.util Using the original student model
2021-05-31 00:20:58,583 INFO torchdistill.core.distillation Loss = 1.0 * OrgLoss
2021-05-31 00:20:58,583 INFO torchdistill.core.distillation Freezing the whole teacher model
2021-05-31 00:21:01,585 INFO torchdistill.misc.log Epoch: [0] [ 0/535] eta: 0:01:04 lr: 9.993769470404985e-05 sample/s: 34.32832986855674 loss: 0.2715 (0.2715) time: 0.1200 data: 0.0034 max mem: 1758
2021-05-31 00:21:07,875 INFO torchdistill.misc.log Epoch: [0] [ 50/535] eta: 0:01:00 lr: 9.682242990654206e-05 sample/s: 40.77632733415159 loss: 0.1915 (0.2239) time: 0.1222 data: 0.0017 max mem: 2766
2021-05-31 00:21:14,086 INFO torchdistill.misc.log Epoch: [0] [100/535] eta: 0:00:54 lr: 9.370716510903426e-05 sample/s: 40.664152406805954 loss: 0.1580 (0.2037) time: 0.1245 data: 0.0017 max mem: 2833
2021-05-31 00:21:20,465 INFO torchdistill.misc.log Epoch: [0] [150/535] eta: 0:00:48 lr: 9.059190031152648e-05 sample/s: 27.442853614372105 loss: 0.2045 (0.1980) time: 0.1318 data: 0.0018 max mem: 2833
2021-05-31 00:21:26,554 INFO torchdistill.misc.log Epoch: [0] [200/535] eta: 0:00:41 lr: 8.74766355140187e-05 sample/s: 32.08432808705131 loss: 0.1471 (0.1922) time: 0.1235 data: 0.0016 max mem: 2833
2021-05-31 00:21:32,711 INFO torchdistill.misc.log Epoch: [0] [250/535] eta: 0:00:35 lr: 8.436137071651092e-05 sample/s: 32.09114018963311 loss: 0.1522 (0.1857) time: 0.1219 data: 0.0017 max mem: 2833
2021-05-31 00:21:38,845 INFO torchdistill.misc.log Epoch: [0] [300/535] eta: 0:00:29 lr: 8.124610591900313e-05 sample/s: 32.025533052093074 loss: 0.1108 (0.1777) time: 0.1217 data: 0.0017 max mem: 2833
2021-05-31 00:21:45,058 INFO torchdistill.misc.log Epoch: [0] [350/535] eta: 0:00:22 lr: 7.813084112149533e-05 sample/s: 32.00012207077816 loss: 0.1630 (0.1759) time: 0.1264 data: 0.0017 max mem: 2836
2021-05-31 00:21:51,311 INFO torchdistill.misc.log Epoch: [0] [400/535] eta: 0:00:16 lr: 7.501557632398754e-05 sample/s: 27.491013960804878 loss: 0.1091 (0.1707) time: 0.1256 data: 0.0017 max mem: 2836
2021-05-31 00:21:57,642 INFO torchdistill.misc.log Epoch: [0] [450/535] eta: 0:00:10 lr: 7.190031152647976e-05 sample/s: 32.02229335226741 loss: 0.1059 (0.1657) time: 0.1231 data: 0.0016 max mem: 2836
2021-05-31 00:22:03,683 INFO torchdistill.misc.log Epoch: [0] [500/535] eta: 0:00:04 lr: 6.878504672897197e-05 sample/s: 40.68426872562904 loss: 0.1292 (0.1631) time: 0.1142 data: 0.0017 max mem: 2915
2021-05-31 00:22:07,877 INFO torchdistill.misc.log Epoch: [0] Total time: 0:01:06
2021-05-31 00:22:08,938 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:22:08,938 INFO __main__ Validation: matthews_correlation = 0.5696227147853862
2021-05-31 00:22:08,938 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:22:10,190 INFO torchdistill.misc.log Epoch: [1] [ 0/535] eta: 0:00:55 lr: 6.660436137071651e-05 sample/s: 39.90689067075158 loss: 0.0587 (0.0587) time: 0.1033 data: 0.0031 max mem: 2915
2021-05-31 00:22:16,375 INFO torchdistill.misc.log Epoch: [1] [ 50/535] eta: 0:00:59 lr: 6.348909657320873e-05 sample/s: 40.69028776542207 loss: 0.0609 (0.0709) time: 0.1267 data: 0.0016 max mem: 2915
2021-05-31 00:22:22,636 INFO torchdistill.misc.log Epoch: [1] [100/535] eta: 0:00:54 lr: 6.037383177570094e-05 sample/s: 40.703219921200244 loss: 0.0721 (0.0677) time: 0.1257 data: 0.0017 max mem: 2915
2021-05-31 00:22:28,858 INFO torchdistill.misc.log Epoch: [1] [150/535] eta: 0:00:47 lr: 5.7258566978193154e-05 sample/s: 23.86856736377863 loss: 0.0641 (0.0676) time: 0.1254 data: 0.0017 max mem: 2915
2021-05-31 00:22:35,146 INFO torchdistill.misc.log Epoch: [1] [200/535] eta: 0:00:41 lr: 5.414330218068536e-05 sample/s: 32.09961332488936 loss: 0.0410 (0.0655) time: 0.1278 data: 0.0017 max mem: 2915
2021-05-31 00:22:41,551 INFO torchdistill.misc.log Epoch: [1] [250/535] eta: 0:00:35 lr: 5.1028037383177574e-05 sample/s: 32.12475203541585 loss: 0.0638 (0.0681) time: 0.1273 data: 0.0017 max mem: 2915
2021-05-31 00:22:47,871 INFO torchdistill.misc.log Epoch: [1] [300/535] eta: 0:00:29 lr: 4.791277258566979e-05 sample/s: 27.227165470614775 loss: 0.0764 (0.0700) time: 0.1252 data: 0.0017 max mem: 2915
2021-05-31 00:22:54,042 INFO torchdistill.misc.log Epoch: [1] [350/535] eta: 0:00:23 lr: 4.4797507788161994e-05 sample/s: 32.06403167947466 loss: 0.0372 (0.0682) time: 0.1282 data: 0.0017 max mem: 2915
2021-05-31 00:23:00,212 INFO torchdistill.misc.log Epoch: [1] [400/535] eta: 0:00:16 lr: 4.168224299065421e-05 sample/s: 32.087089401662 loss: 0.0755 (0.0690) time: 0.1181 data: 0.0016 max mem: 2915
2021-05-31 00:23:06,665 INFO torchdistill.misc.log Epoch: [1] [450/535] eta: 0:00:10 lr: 3.856697819314642e-05 sample/s: 27.528814883122593 loss: 0.0612 (0.0692) time: 0.1331 data: 0.0017 max mem: 2917
2021-05-31 00:23:13,008 INFO torchdistill.misc.log Epoch: [1] [500/535] eta: 0:00:04 lr: 3.545171339563863e-05 sample/s: 32.032503684921984 loss: 0.0505 (0.0688) time: 0.1282 data: 0.0017 max mem: 2917
2021-05-31 00:23:17,244 INFO torchdistill.misc.log Epoch: [1] Total time: 0:01:07
2021-05-31 00:23:18,310 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:23:18,310 INFO __main__ Validation: matthews_correlation = 0.5915362000526524
2021-05-31 00:23:18,310 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:23:19,657 INFO torchdistill.misc.log Epoch: [2] [ 0/535] eta: 0:01:08 lr: 3.327102803738318e-05 sample/s: 31.84517412406494 loss: 0.0010 (0.0010) time: 0.1285 data: 0.0029 max mem: 2917
2021-05-31 00:23:25,866 INFO torchdistill.misc.log Epoch: [2] [ 50/535] eta: 0:01:00 lr: 3.015576323987539e-05 sample/s: 40.70726907893619 loss: 0.0024 (0.0234) time: 0.1229 data: 0.0017 max mem: 2917
2021-05-31 00:23:32,177 INFO torchdistill.misc.log Epoch: [2] [100/535] eta: 0:00:54 lr: 2.7040498442367603e-05 sample/s: 31.97889576352276 loss: 0.0047 (0.0244) time: 0.1282 data: 0.0017 max mem: 2917
2021-05-31 00:23:38,580 INFO torchdistill.misc.log Epoch: [2] [150/535] eta: 0:00:48 lr: 2.3925233644859816e-05 sample/s: 32.0352560858947 loss: 0.0017 (0.0278) time: 0.1293 data: 0.0017 max mem: 2917
2021-05-31 00:23:45,020 INFO torchdistill.misc.log Epoch: [2] [200/535] eta: 0:00:42 lr: 2.0809968847352026e-05 sample/s: 32.08377588115941 loss: 0.0033 (0.0264) time: 0.1303 data: 0.0017 max mem: 2917
2021-05-31 00:23:51,367 INFO torchdistill.misc.log Epoch: [2] [250/535] eta: 0:00:36 lr: 1.769470404984424e-05 sample/s: 40.8114407758866 loss: 0.0048 (0.0263) time: 0.1203 data: 0.0016 max mem: 2917
2021-05-31 00:23:57,611 INFO torchdistill.misc.log Epoch: [2] [300/535] eta: 0:00:29 lr: 1.457943925233645e-05 sample/s: 32.14740440404381 loss: 0.0006 (0.0274) time: 0.1241 data: 0.0016 max mem: 2917
2021-05-31 00:24:03,995 INFO torchdistill.misc.log Epoch: [2] [350/535] eta: 0:00:23 lr: 1.1464174454828661e-05 sample/s: 32.08083110406584 loss: 0.0287 (0.0277) time: 0.1269 data: 0.0017 max mem: 2917
2021-05-31 00:24:10,245 INFO torchdistill.misc.log Epoch: [2] [400/535] eta: 0:00:17 lr: 8.348909657320873e-06 sample/s: 40.49562511917779 loss: 0.0076 (0.0275) time: 0.1253 data: 0.0016 max mem: 2917
2021-05-31 00:24:16,391 INFO torchdistill.misc.log Epoch: [2] [450/535] eta: 0:00:10 lr: 5.233644859813085e-06 sample/s: 32.0666056318915 loss: 0.0009 (0.0268) time: 0.1182 data: 0.0016 max mem: 2917
2021-05-31 00:24:22,587 INFO torchdistill.misc.log Epoch: [2] [500/535] eta: 0:00:04 lr: 2.118380062305296e-06 sample/s: 32.093043185504854 loss: 0.0009 (0.0269) time: 0.1207 data: 0.0018 max mem: 2917
2021-05-31 00:24:26,813 INFO torchdistill.misc.log Epoch: [2] Total time: 0:01:07
2021-05-31 00:24:27,875 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:27,875 INFO __main__ Validation: matthews_correlation = 0.6175138437153591
2021-05-31 00:24:27,876 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:24:29,073 INFO __main__ [Teacher: bert-large-uncased]
2021-05-31 00:24:31,822 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:31,822 INFO __main__ Test: matthews_correlation = 0.6335324951654004
2021-05-31 00:24:33,639 INFO __main__ [Student: bert-base-uncased]
2021-05-31 00:24:34,706 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:34,706 INFO __main__ Test: matthews_correlation = 0.6175138437153591
2021-05-31 00:24:34,706 INFO __main__ Start prediction for private dataset(s)
2021-05-31 00:24:34,707 INFO __main__ cola/test: 1063 samples
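The validation and test figures above are Matthews correlation coefficients (MCC), the standard CoLA metric; validation MCC rises from 0.5696 to 0.6175 over the three epochs. A toy illustration of the metric with made-up labels (scikit-learn assumed available):

```python
from sklearn.metrics import matthews_corrcoef

# 1 = acceptable, 0 = unacceptable; these labels and predictions are made up.
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(matthews_corrcoef(y_true, y_pred))  # 0.5 here; 1.0 is perfect, 0.0 is chance level
```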
vocab.txt ADDED
The diff for this file is too large to render.