---
license: apache-2.0
base_model: GanjinZero/biobart-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine-tuned-BioBART-15-epochs-1024-input-128-output
  results: []
---
# fine-tuned-BioBART-15-epochs-1024-input-128-output
This model is a fine-tuned version of [GanjinZero/biobart-base](https://huggingface.co/GanjinZero/biobart-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.5422
- Rouge1: 0.1912
- Rouge2: 0.042
- Rougel: 0.1492
- Rougelsum: 0.15
- Gen Len: 29.97
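
As the model name indicates, this is a BART-style encoder-decoder fine-tuned with 1024-token inputs and 128-token outputs, so it can be loaded with the standard `transformers` seq2seq classes. The sketch below is illustrative only: the repository id is a placeholder (the actual hub path of this checkpoint is not stated in this card), and the generation settings simply mirror the lengths in the model name.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id -- replace with the actual hub path of this checkpoint.
model_id = "your-username/fine-tuned-BioBART-15-epochs-1024-input-128-output"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # a biomedical document to summarize

# Truncate inputs to the 1024-token limit used during fine-tuning.
inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")

# Cap generation at 128 new tokens, matching the fine-tuning output length.
summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```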
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
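
For reference, the list above corresponds roughly to the following `Seq2SeqTrainingArguments`. This is a reconstruction, not the original training script: dataset loading, tokenization, and the `Seq2SeqTrainer` wiring are omitted, `output_dir` is a placeholder, and the Adam betas/epsilon shown above are the library defaults so they need not be set explicitly.

```python
from transformers import Seq2SeqTrainingArguments

# Reconstructed from the hyperparameter list above; not the original script.
training_args = Seq2SeqTrainingArguments(
    output_dir="fine-tuned-BioBART-15-epochs-1024-input-128-output",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",   # linear decay after warmup
    warmup_ratio=0.1,             # 10% of training steps used for warmup
    num_train_epochs=15,
    predict_with_generate=True,   # required to compute ROUGE during evaluation
    evaluation_strategy="epoch",  # matches the per-epoch results table below
)
```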
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log        | 1.0   | 151  | 5.4038          | 0.0026 | 0.0006 | 0.0024 | 0.0024    | 5.98    |
| No log        | 2.0   | 302  | 1.8682          | 0.0475 | 0.0066 | 0.0452 | 0.043     | 6.59    |
| No log        | 3.0   | 453  | 1.6512          | 0.0753 | 0.0169 | 0.0587 | 0.0586    | 22.62   |
| 4.1375        | 4.0   | 604  | 1.5702          | 0.1472 | 0.0366 | 0.1124 | 0.1113    | 42.92   |
| 4.1375        | 5.0   | 755  | 1.5256          | 0.167  | 0.0337 | 0.1309 | 0.1305    | 45.89   |
| 4.1375        | 6.0   | 906  | 1.5057          | 0.1435 | 0.0305 | 0.1132 | 0.1134    | 32.45   |
| 1.1893        | 7.0   | 1057 | 1.4854          | 0.1655 | 0.0388 | 0.129  | 0.1295    | 34.34   |
| 1.1893        | 8.0   | 1208 | 1.4845          | 0.1635 | 0.0423 | 0.1238 | 0.1252    | 37.77   |
| 1.1893        | 9.0   | 1359 | 1.4980          | 0.1712 | 0.0363 | 0.1382 | 0.1388    | 29.68   |
| 0.8262        | 10.0  | 1510 | 1.5052          | 0.1917 | 0.0431 | 0.1486 | 0.1497    | 32.88   |
| 0.8262        | 11.0  | 1661 | 1.5167          | 0.1731 | 0.0374 | 0.1402 | 0.1403    | 29.9    |
| 0.8262        | 12.0  | 1812 | 1.5267          | 0.1675 | 0.035  | 0.1335 | 0.1337    | 29.35   |
| 0.8262        | 13.0  | 1963 | 1.5329          | 0.1839 | 0.0401 | 0.1465 | 0.1465    | 28.23   |
| 0.61          | 14.0  | 2114 | 1.5440          | 0.1904 | 0.0452 | 0.1522 | 0.1527    | 29.33   |
| 0.61          | 15.0  | 2265 | 1.5422          | 0.1912 | 0.042  | 0.1492 | 0.15      | 29.97   |
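
The ROUGE values in the table are fractions in the 0–1 range, consistent with the scores returned by the Hugging Face `evaluate` package. A minimal sketch of how such scores are computed from generated summaries (assuming `evaluate` was the metric backend, which this card does not state explicitly):

```python
import evaluate

rouge = evaluate.load("rouge")

# Toy inputs for illustration; in practice these come from model.generate().
predictions = ["the generated summary for one example"]
references = ["the reference summary for that example"]

# Returns rouge1/rouge2/rougeL/rougeLsum as fractions in [0, 1],
# matching the scale of the table above.
scores = rouge.compute(predictions=predictions, references=references)
print(scores)
```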
### Framework versions
- Transformers 4.36.2
- Pytorch 1.12.1+cu113
- Datasets 2.16.1
- Tokenizers 0.15.0