kwwww committed
Commit a4d8175 · 1 Parent(s): 6cf3249

update model card README.md

Files changed (1)
  1. README.md +55 -55
README.md CHANGED
@@ -21,11 +21,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the bionlp2004 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1710
- - Precision: 0.7741
- - Recall: 0.8224
- - F1: 0.7975
- - Accuracy: 0.9480
+ - Loss: 0.1660
+ - Precision: 0.7730
+ - Recall: 0.8185
+ - F1: 0.7951
+ - Accuracy: 0.9478
 
 ## Model description
 
@@ -56,56 +56,56 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 0.2323 | 1.0 | 1039 | 0.1995 | 0.6662 | 0.7625 | 0.7111 | 0.9308 |
- | 0.2094 | 2.0 | 2078 | 0.1827 | 0.7369 | 0.7693 | 0.7528 | 0.9392 |
- | 0.1976 | 3.0 | 3117 | 0.1748 | 0.7272 | 0.7886 | 0.7566 | 0.9402 |
- | 0.1924 | 4.0 | 4156 | 0.1770 | 0.7284 | 0.7940 | 0.7598 | 0.9402 |
- | 0.1841 | 5.0 | 5195 | 0.1703 | 0.7421 | 0.7852 | 0.7630 | 0.9422 |
- | 0.1821 | 6.0 | 6234 | 0.1733 | 0.7337 | 0.7961 | 0.7636 | 0.9411 |
- | 0.1789 | 7.0 | 7273 | 0.1727 | 0.7358 | 0.7765 | 0.7556 | 0.9418 |
- | 0.1782 | 8.0 | 8312 | 0.1682 | 0.7395 | 0.8042 | 0.7705 | 0.9435 |
- | 0.178 | 9.0 | 9351 | 0.1636 | 0.7453 | 0.8055 | 0.7742 | 0.9443 |
- | 0.1687 | 10.0 | 10390 | 0.1915 | 0.7169 | 0.7654 | 0.7404 | 0.9362 |
- | 0.1671 | 11.0 | 11429 | 0.1655 | 0.7552 | 0.7781 | 0.7665 | 0.9428 |
- | 0.1633 | 12.0 | 12468 | 0.1755 | 0.7123 | 0.8251 | 0.7646 | 0.9398 |
- | 0.1616 | 13.0 | 13507 | 0.1639 | 0.7405 | 0.7900 | 0.7645 | 0.9435 |
- | 0.1611 | 14.0 | 14546 | 0.1620 | 0.7632 | 0.7979 | 0.7802 | 0.9460 |
- | 0.1616 | 15.0 | 15585 | 0.1624 | 0.7494 | 0.8006 | 0.7742 | 0.9445 |
- | 0.1577 | 16.0 | 16624 | 0.1680 | 0.7551 | 0.8001 | 0.7770 | 0.9456 |
- | 0.1567 | 17.0 | 17663 | 0.1666 | 0.7446 | 0.8133 | 0.7774 | 0.9446 |
- | 0.1511 | 18.0 | 18702 | 0.1678 | 0.7479 | 0.8133 | 0.7792 | 0.9437 |
- | 0.1537 | 19.0 | 19741 | 0.1610 | 0.7651 | 0.7988 | 0.7816 | 0.9469 |
- | 0.15 | 20.0 | 20780 | 0.1613 | 0.7498 | 0.8228 | 0.7846 | 0.9458 |
- | 0.1499 | 21.0 | 21819 | 0.1634 | 0.7480 | 0.8088 | 0.7772 | 0.9434 |
- | 0.1496 | 22.0 | 22858 | 0.1635 | 0.7508 | 0.8149 | 0.7815 | 0.9456 |
- | 0.1487 | 23.0 | 23897 | 0.1603 | 0.7537 | 0.8253 | 0.7879 | 0.9462 |
- | 0.1425 | 24.0 | 24936 | 0.1614 | 0.7642 | 0.8077 | 0.7853 | 0.9458 |
- | 0.1427 | 25.0 | 25975 | 0.1727 | 0.7601 | 0.8021 | 0.7805 | 0.9456 |
- | 0.1408 | 26.0 | 27014 | 0.1616 | 0.7690 | 0.8152 | 0.7914 | 0.9462 |
- | 0.1401 | 27.0 | 28053 | 0.1613 | 0.7661 | 0.8032 | 0.7842 | 0.9463 |
- | 0.1387 | 28.0 | 29092 | 0.1642 | 0.7585 | 0.8169 | 0.7866 | 0.9457 |
- | 0.1354 | 29.0 | 30131 | 0.1650 | 0.7453 | 0.8214 | 0.7815 | 0.9455 |
- | 0.1331 | 30.0 | 31170 | 0.1622 | 0.7744 | 0.8152 | 0.7943 | 0.9477 |
- | 0.1325 | 31.0 | 32209 | 0.1626 | 0.7592 | 0.8142 | 0.7857 | 0.9454 |
- | 0.1338 | 32.0 | 33248 | 0.1628 | 0.7564 | 0.8152 | 0.7847 | 0.9455 |
- | 0.1296 | 33.0 | 34287 | 0.1660 | 0.7706 | 0.8203 | 0.7947 | 0.9469 |
- | 0.1323 | 34.0 | 35326 | 0.1647 | 0.7674 | 0.8120 | 0.7890 | 0.9466 |
- | 0.1275 | 35.0 | 36365 | 0.1644 | 0.7715 | 0.8118 | 0.7912 | 0.9469 |
- | 0.1245 | 36.0 | 37404 | 0.1607 | 0.7717 | 0.8280 | 0.7989 | 0.9486 |
- | 0.1261 | 37.0 | 38443 | 0.1620 | 0.7691 | 0.8230 | 0.7951 | 0.9476 |
- | 0.1221 | 38.0 | 39482 | 0.1680 | 0.7645 | 0.8116 | 0.7874 | 0.9468 |
- | 0.1211 | 39.0 | 40521 | 0.1682 | 0.7615 | 0.8172 | 0.7884 | 0.9451 |
- | 0.1236 | 40.0 | 41560 | 0.1625 | 0.7730 | 0.8246 | 0.7979 | 0.9472 |
- | 0.1165 | 41.0 | 42599 | 0.1714 | 0.7629 | 0.8149 | 0.7881 | 0.9456 |
- | 0.1188 | 42.0 | 43638 | 0.1677 | 0.7729 | 0.8143 | 0.7931 | 0.9474 |
- | 0.1166 | 43.0 | 44677 | 0.1702 | 0.7674 | 0.8318 | 0.7983 | 0.9467 |
- | 0.1159 | 44.0 | 45716 | 0.1720 | 0.7709 | 0.8235 | 0.7963 | 0.9472 |
- | 0.1135 | 45.0 | 46755 | 0.1707 | 0.7772 | 0.8197 | 0.7979 | 0.9475 |
- | 0.1134 | 46.0 | 47794 | 0.1710 | 0.7742 | 0.8174 | 0.7952 | 0.9477 |
- | 0.1121 | 47.0 | 48833 | 0.1682 | 0.7756 | 0.8248 | 0.7994 | 0.9478 |
- | 0.1098 | 48.0 | 49872 | 0.1711 | 0.7724 | 0.8206 | 0.7958 | 0.9475 |
- | 0.1108 | 49.0 | 50911 | 0.1712 | 0.7741 | 0.8208 | 0.7968 | 0.9476 |
- | 0.1062 | 50.0 | 51950 | 0.1710 | 0.7741 | 0.8224 | 0.7975 | 0.9480 |
+ | 0.233 | 1.0 | 1039 | 0.1956 | 0.6860 | 0.7301 | 0.7073 | 0.9331 |
+ | 0.2099 | 2.0 | 2078 | 0.1890 | 0.7214 | 0.7508 | 0.7358 | 0.9370 |
+ | 0.1991 | 3.0 | 3117 | 0.1862 | 0.7058 | 0.7785 | 0.7404 | 0.9375 |
+ | 0.19 | 4.0 | 4156 | 0.1736 | 0.7420 | 0.7943 | 0.7673 | 0.9434 |
+ | 0.1894 | 5.0 | 5195 | 0.1748 | 0.7319 | 0.7722 | 0.7515 | 0.9417 |
+ | 0.1814 | 6.0 | 6234 | 0.1686 | 0.7351 | 0.7952 | 0.7640 | 0.9427 |
+ | 0.177 | 7.0 | 7273 | 0.1682 | 0.7404 | 0.8086 | 0.7730 | 0.9448 |
+ | 0.1756 | 8.0 | 8312 | 0.1740 | 0.7386 | 0.7796 | 0.7585 | 0.9423 |
+ | 0.1761 | 9.0 | 9351 | 0.1691 | 0.7442 | 0.7664 | 0.7551 | 0.9430 |
+ | 0.1693 | 10.0 | 10390 | 0.1641 | 0.7506 | 0.8113 | 0.7797 | 0.9446 |
+ | 0.1697 | 11.0 | 11429 | 0.1669 | 0.7297 | 0.7938 | 0.7604 | 0.9427 |
+ | 0.1607 | 12.0 | 12468 | 0.1654 | 0.7593 | 0.8185 | 0.7878 | 0.9454 |
+ | 0.1643 | 13.0 | 13507 | 0.1652 | 0.7288 | 0.8035 | 0.7644 | 0.9430 |
+ | 0.1618 | 14.0 | 14546 | 0.1592 | 0.7548 | 0.7988 | 0.7762 | 0.9464 |
+ | 0.1598 | 15.0 | 15585 | 0.1641 | 0.7575 | 0.8006 | 0.7785 | 0.9454 |
+ | 0.16 | 16.0 | 16624 | 0.1621 | 0.7440 | 0.8174 | 0.7790 | 0.9456 |
+ | 0.1572 | 17.0 | 17663 | 0.1669 | 0.7598 | 0.8015 | 0.7801 | 0.9453 |
+ | 0.1528 | 18.0 | 18702 | 0.1680 | 0.7332 | 0.8073 | 0.7685 | 0.9427 |
+ | 0.1513 | 19.0 | 19741 | 0.1653 | 0.7630 | 0.7920 | 0.7772 | 0.9453 |
+ | 0.1504 | 20.0 | 20780 | 0.1635 | 0.7645 | 0.8073 | 0.7853 | 0.9461 |
+ | 0.1491 | 21.0 | 21819 | 0.1591 | 0.7547 | 0.8262 | 0.7889 | 0.9473 |
+ | 0.1455 | 22.0 | 22858 | 0.1627 | 0.7634 | 0.8145 | 0.7881 | 0.9457 |
+ | 0.145 | 23.0 | 23897 | 0.1584 | 0.7529 | 0.8210 | 0.7855 | 0.9464 |
+ | 0.1438 | 24.0 | 24936 | 0.1603 | 0.7592 | 0.8012 | 0.7796 | 0.9466 |
+ | 0.1413 | 25.0 | 25975 | 0.1614 | 0.7699 | 0.8134 | 0.7911 | 0.9470 |
+ | 0.1437 | 26.0 | 27014 | 0.1594 | 0.7557 | 0.8226 | 0.7877 | 0.9465 |
+ | 0.1414 | 27.0 | 28053 | 0.1605 | 0.7680 | 0.8183 | 0.7923 | 0.9478 |
+ | 0.1385 | 28.0 | 29092 | 0.1631 | 0.7588 | 0.8028 | 0.7802 | 0.9459 |
+ | 0.1365 | 29.0 | 30131 | 0.1568 | 0.7701 | 0.8167 | 0.7927 | 0.9482 |
+ | 0.1352 | 30.0 | 31170 | 0.1607 | 0.7660 | 0.8271 | 0.7954 | 0.9481 |
+ | 0.1331 | 31.0 | 32209 | 0.1646 | 0.7627 | 0.8122 | 0.7867 | 0.9461 |
+ | 0.1328 | 32.0 | 33248 | 0.1658 | 0.7560 | 0.8176 | 0.7856 | 0.9464 |
+ | 0.1319 | 33.0 | 34287 | 0.1579 | 0.7639 | 0.8228 | 0.7923 | 0.9486 |
+ | 0.1309 | 34.0 | 35326 | 0.1595 | 0.7666 | 0.8151 | 0.7901 | 0.9471 |
+ | 0.1271 | 35.0 | 36365 | 0.1616 | 0.7645 | 0.8248 | 0.7935 | 0.9476 |
+ | 0.1262 | 36.0 | 37404 | 0.1615 | 0.7641 | 0.8104 | 0.7866 | 0.9464 |
+ | 0.1227 | 37.0 | 38443 | 0.1614 | 0.7667 | 0.8273 | 0.7958 | 0.9475 |
+ | 0.1207 | 38.0 | 39482 | 0.1640 | 0.7763 | 0.8014 | 0.7887 | 0.9472 |
+ | 0.1212 | 39.0 | 40521 | 0.1613 | 0.7716 | 0.8142 | 0.7923 | 0.9487 |
+ | 0.1192 | 40.0 | 41560 | 0.1596 | 0.7773 | 0.8196 | 0.7979 | 0.9493 |
+ | 0.1193 | 41.0 | 42599 | 0.1684 | 0.7769 | 0.8071 | 0.7917 | 0.9473 |
+ | 0.1171 | 42.0 | 43638 | 0.1636 | 0.7717 | 0.8183 | 0.7943 | 0.9471 |
+ | 0.1146 | 43.0 | 44677 | 0.1613 | 0.7675 | 0.8217 | 0.7937 | 0.9476 |
+ | 0.1154 | 44.0 | 45716 | 0.1648 | 0.7725 | 0.8066 | 0.7892 | 0.9467 |
+ | 0.1149 | 45.0 | 46755 | 0.1660 | 0.7745 | 0.8172 | 0.7953 | 0.9476 |
+ | 0.1133 | 46.0 | 47794 | 0.1655 | 0.7742 | 0.8187 | 0.7958 | 0.9480 |
+ | 0.1121 | 47.0 | 48833 | 0.1659 | 0.7768 | 0.8156 | 0.7957 | 0.9481 |
+ | 0.1104 | 48.0 | 49872 | 0.1663 | 0.7714 | 0.8129 | 0.7916 | 0.9478 |
+ | 0.1069 | 49.0 | 50911 | 0.1659 | 0.7746 | 0.8163 | 0.7949 | 0.9479 |
+ | 0.11 | 50.0 | 51950 | 0.1660 | 0.7730 | 0.8185 | 0.7951 | 0.9478 |
 
 
  ### Framework versions
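
For reference, outside the diff itself: a minimal sketch of how a fine-tuned BERT token-classification checkpoint like the one this card describes could be loaded for inference with the `transformers` pipeline. The repo id below is a placeholder (the actual Hub id is not shown in this commit), and the example sentence is only illustrative of BioNLP 2004-style entities.

```python
# Minimal usage sketch (not part of the commit). The repo id is a
# placeholder; replace it with the actual Hub id of this fine-tuned model.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/bert-base-uncased-bionlp2004",  # placeholder id
    aggregation_strategy="simple",  # group word pieces into entity spans
)

# BioNLP 2004-style sentence (protein / DNA / RNA / cell-type entities).
print(ner("IL-2 gene expression requires activation of NF-kappa B in T cells."))
```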