---
license: apache-2.0
base_model: GanjinZero/biobart-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine-tuned-BioBART-15-epochs-1500-input-256-output
  results: []
---
# fine-tuned-BioBART-15-epochs-1500-input-256-output

This model is a fine-tuned version of [GanjinZero/biobart-base](https://huggingface.co/GanjinZero/biobart-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8909
- Rouge1: 0.1801
- Rouge2: 0.0468
- Rougel: 0.1405
- Rougelsum: 0.1403
- Gen Len: 36.53
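
A minimal inference sketch with the `transformers` library. The repo id below is a placeholder (replace it with the actual checkpoint location), and the 256-token generation cap is inferred from the model name, not documented in this card:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical repo id; replace with the actual checkpoint path or Hub id.
checkpoint = "your-username/fine-tuned-BioBART-15-epochs-1500-input-256-output"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = "Patient presents with persistent cough and low-grade fever ..."
# Truncate to the encoder's maximum supported input length.
inputs = tokenizer(text, truncation=True, return_tensors="pt")

# 256-token output cap, inferred from the "256-output" in the model name.
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```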
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the training-arguments sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
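
A sketch of how these settings map onto `Seq2SeqTrainingArguments` (Transformers 4.36). The `output_dir` is a placeholder, and `evaluation_strategy`/`predict_with_generate` are assumptions consistent with the per-epoch ROUGE evaluation shown in the results table below, not taken from the original training script:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="fine-tuned-BioBART-15-epochs-1500-input-256-output",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=15,
    evaluation_strategy="epoch",  # assumed: the table below reports one eval per epoch
    predict_with_generate=True,   # assumed: required to compute ROUGE on generated text
)
```

The Adam betas `(0.9, 0.999)` and epsilon `1e-08` listed above are the `Trainer` optimizer defaults, so they need no explicit arguments here.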
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log        | 1.0   | 301  | 1.4869          | 0.086  | 0.0035 | 0.0743 | 0.0744    | 10.36   |
| 4.753         | 2.0   | 602  | 0.9880          | 0.0958 | 0.0233 | 0.0687 | 0.0686    | 47.88   |
| 4.753         | 3.0   | 903  | 0.9108          | 0.0622 | 0.009  | 0.0617 | 0.0622    | 10.39   |
| 0.9239        | 4.0   | 1204 | 0.8681          | 0.1159 | 0.0272 | 0.085  | 0.0849    | 27.95   |
| 0.7513        | 5.0   | 1505 | 0.8490          | 0.1732 | 0.0333 | 0.1345 | 0.1352    | 37.9    |
| 0.7513        | 6.0   | 1806 | 0.8486          | 0.1523 | 0.0297 | 0.12   | 0.1209    | 33.96   |
| 0.5751        | 7.0   | 2107 | 0.8402          | 0.1825 | 0.0426 | 0.1433 | 0.1437    | 40.27   |
| 0.5751        | 8.0   | 2408 | 0.8500          | 0.1712 | 0.0303 | 0.1425 | 0.1427    | 31.42   |
| 0.4702        | 9.0   | 2709 | 0.8542          | 0.1764 | 0.0325 | 0.1275 | 0.1273    | 44.93   |
| 0.3782        | 10.0  | 3010 | 0.8615          | 0.1667 | 0.042  | 0.1328 | 0.1335    | 36.86   |
| 0.3782        | 11.0  | 3311 | 0.8714          | 0.1756 | 0.0358 | 0.1364 | 0.1359    | 35.21   |
| 0.3005        | 12.0  | 3612 | 0.8772          | 0.1801 | 0.0368 | 0.1427 | 0.1427    | 33.99   |
| 0.3005        | 13.0  | 3913 | 0.8818          | 0.1685 | 0.0397 | 0.1323 | 0.1331    | 34.67   |
| 0.2417        | 14.0  | 4214 | 0.8891          | 0.189  | 0.0495 | 0.145  | 0.1445    | 36.0    |
| 0.2148        | 15.0  | 4515 | 0.8909          | 0.1801 | 0.0468 | 0.1405 | 0.1403    | 36.53   |
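
The ROUGE columns above can be computed with the `evaluate` library; a minimal sketch, where `preds` and `refs` are hypothetical stand-ins for the model's generated summaries and the reference summaries:

```python
import evaluate

rouge = evaluate.load("rouge")

# Hypothetical example data; in practice these come from the eval set.
preds = ["the patient was discharged in stable condition"]
refs = ["patient discharged in stable condition"]

scores = rouge.compute(predictions=preds, references=refs)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```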
### Framework versions
- Transformers 4.36.2
- Pytorch 1.12.1+cu113
- Datasets 2.16.1
- Tokenizers 0.15.0