helling100 committed
Commit 95dfd4a · 1 Parent(s): 8f64a26

Upload TFDistilBertForSequenceClassification

Files changed (3):
  1. README.md +21 -21
  2. config.json +6 -0
  3. tf_model.h5 +2 -2
README.md CHANGED
@@ -14,16 +14,16 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 0.1772
-- Train Mae: 0.2939
-- Train Mse: 0.1408
-- Train R2-score: 0.5387
-- Train Accuracy: 0.6538
-- Validation Loss: 0.1822
-- Validation Mae: 0.3656
-- Validation Mse: 0.1809
-- Validation R2-score: 0.7154
-- Validation Accuracy: 0.7027
+- Train Loss: 0.1548
+- Train Mae: 0.2765
+- Train Mse: 0.1336
+- Train R2-score: 0.7547
+- Train Accuracy: 0.7462
+- Validation Loss: 0.1908
+- Validation Mae: 0.3787
+- Validation Mse: 0.1894
+- Validation R2-score: 0.8458
+- Validation Accuracy: 0.4595
 - Epoch: 9
 
 ## Model description
@@ -43,23 +43,23 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 2e-06, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
+- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 2e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
 - training_precision: float32
 
 ### Training results
 
 | Train Loss | Train Mae | Train Mse | Train R2-score | Train Accuracy | Validation Loss | Validation Mae | Validation Mse | Validation R2-score | Validation Accuracy | Epoch |
 |:----------:|:---------:|:---------:|:--------------:|:--------------:|:---------------:|:--------------:|:--------------:|:-------------------:|:-------------------:|:-----:|
-| 1.5062     | 0.5873    | 0.4379    | -0.5882        | 0.4615         | 0.4702          | 0.6369         | 0.4700         | 0.2677              | 0.3243              | 0     |
-| 0.5726     | 0.4513    | 0.2799    | -0.2102        | 0.5538         | 0.2739          | 0.4733         | 0.2732         | 0.5908              | 0.3243              | 1     |
-| 0.1934     | 0.3138    | 0.1581    | 0.3070         | 0.6154         | 0.1815          | 0.3648         | 0.1802         | 0.7161              | 0.6351              | 2     |
-| 0.1820     | 0.2921    | 0.1420    | 0.5935         | 0.6731         | 0.1927          | 0.3805         | 0.1914         | 0.7023              | 0.6081              | 3     |
-| 0.1631     | 0.2892    | 0.1408    | 0.3709         | 0.6538         | 0.1837          | 0.3715         | 0.1824         | 0.7124              | 0.6216              | 4     |
-| 0.1821     | 0.2952    | 0.1445    | 0.3606         | 0.6615         | 0.1854          | 0.3706         | 0.1841         | 0.7113              | 0.5946              | 5     |
-| 0.1691     | 0.2740    | 0.1261    | 0.5228         | 0.7692         | 0.1748          | 0.3595         | 0.1734         | 0.7235              | 0.8514              | 6     |
-| 0.1525     | 0.2761    | 0.1273    | 0.4328         | 0.7577         | 0.1754          | 0.3607         | 0.1740         | 0.7227              | 0.8378              | 7     |
-| 0.1912     | 0.2910    | 0.1370    | 0.5786         | 0.7000         | 0.2063          | 0.3964         | 0.2052         | 0.6847              | 0.4865              | 8     |
-| 0.1772     | 0.2939    | 0.1408    | 0.5387         | 0.6538         | 0.1822          | 0.3656         | 0.1809         | 0.7154              | 0.7027              | 9     |
+| 0.5723     | 0.3984    | 0.2343    | 0.4755         | 0.5923         | 0.1856          | 0.3686         | 0.1843         | 0.8559              | 0.4324              | 0     |
+| 0.1822     | 0.2906    | 0.1403    | 0.7246         | 0.6538         | 0.1577          | 0.3485         | 0.1561         | 0.8714              | 0.9459              | 1     |
+| 0.1765     | 0.2865    | 0.1376    | 0.6770         | 0.6538         | 0.1356          | 0.3325         | 0.1337         | 0.8808              | 0.9459              | 2     |
+| 0.1959     | 0.2945    | 0.1383    | 0.6806         | 0.7308         | 0.2115          | 0.4054         | 0.2104         | 0.8366              | 0.3243              | 3     |
+| 0.1698     | 0.2906    | 0.1408    | 0.7195         | 0.6231         | 0.1489          | 0.3371         | 0.1472         | 0.8726              | 0.9459              | 4     |
+| 0.2081     | 0.2687    | 0.1178    | 0.7632         | 0.8385         | 0.2547          | 0.4572         | 0.2539         | 0.8046              | 0.3243              | 5     |
+| 0.1806     | 0.3087    | 0.1554    | 0.7168         | 0.6462         | 0.1477          | 0.3401         | 0.1460         | 0.8757              | 0.9459              | 6     |
+| 0.1910     | 0.3102    | 0.1559    | 0.7295         | 0.6308         | 0.1726          | 0.3544         | 0.1711         | 0.8602              | 0.8919              | 7     |
+| 0.1697     | 0.2609    | 0.1132    | 0.7876         | 0.8538         | 0.1856          | 0.3694         | 0.1843         | 0.8537              | 0.5946              | 8     |
+| 0.1548     | 0.2765    | 0.1336    | 0.7547         | 0.7462         | 0.1908          | 0.3787         | 0.1894         | 0.8458              | 0.4595              | 9     |
 
 
 ### Framework versions
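The only hyperparameter that actually changed between the two revisions is the Adam learning rate (2e-06 → 2e-05); everything else in the serialized optimizer dict is identical. A minimal stdlib sketch for diffing two such serialized Keras optimizer configs (the dicts are copied from the model card; the `config_diff` helper is illustrative, not part of the repository):

```python
# Old optimizer config, copied verbatim from the model card before this commit.
old = {'name': 'Adam', 'weight_decay': None, 'clipnorm': None,
       'global_clipnorm': None, 'clipvalue': None, 'use_ema': False,
       'ema_momentum': 0.99, 'ema_overwrite_frequency': None,
       'jit_compile': True, 'is_legacy_optimizer': False,
       'learning_rate': 2e-06, 'beta_1': 0.9, 'beta_2': 0.999,
       'epsilon': 1e-07, 'amsgrad': False}

# New config differs only in the learning rate, per the diff above.
new = dict(old, learning_rate=2e-05)

def config_diff(a, b):
    """Return {key: (old_value, new_value)} for every differing key."""
    return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

print(config_diff(old, new))  # {'learning_rate': (2e-06, 2e-05)}
```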
config.json CHANGED
@@ -8,7 +8,13 @@
   "dim": 768,
   "dropout": 0.1,
   "hidden_dim": 3072,
+  "id2label": {
+    "0": "LABEL_0"
+  },
   "initializer_range": 0.02,
+  "label2id": {
+    "LABEL_0": 0
+  },
   "max_position_embeddings": 512,
   "model_type": "distilbert",
   "n_heads": 12,
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f0fa9ae3ae37c376ed81b20ff67bf9729d8b16c84dfc16fe51ddbf4edb53f64a
-size 267955144
+oid sha256:b0014d906fdbb1ebe74a86abeb17610a837cb095ef13f2277886ceeec9b51611
+size 267952072
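The `tf_model.h5` entry in the diff is a Git LFS pointer file, not the weights themselves: the repo stores only the blob's SHA-256 and byte size, and LFS fetches the real file at checkout. A small stdlib sketch that builds such a pointer for an arbitrary blob (the `lfs_pointer` helper is illustrative):

```python
import hashlib

def lfs_pointer(data: bytes) -> str:
    """Build a Git LFS v1 pointer for a blob, in the same three-line
    format stored in the repository in place of the large file."""
    oid = hashlib.sha256(data).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(data)}\n"
    )

print(lfs_pointer(b"example-weights"))
```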