# pkemon_cap_v0
This model is a fine-tuned version of [microsoft/git-base](https://huggingface.co/microsoft/git-base) on an unspecified dataset. It achieves the following results on the evaluation set (see the WER sketch after the list):
- Loss: 7.6491
- Wer Score: 127.2727
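
The Wer Score above is presumably a word error rate computed on decoded captions; the exact computation is not documented in the card. A minimal sketch of how such a metric can be computed with the 🤗 `evaluate` library (the example strings are hypothetical):

```python
import evaluate  # requires the `evaluate` and `jiwer` packages

# Load the standard word error rate metric
wer = evaluate.load("wer")

# Hypothetical decoded caption and reference; real evaluation would decode
# the model's generated ids with the processor's tokenizer first
predictions = ["a blue pokemon with large eyes standing on grass"]
references = ["a drawing of a blue pokemon with big eyes"]

score = wer.compute(predictions=predictions, references=references)
print(score)  # word-level edit distance divided by total reference length
```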
## Model description
More information needed
## Intended uses & limitations
More information needed
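
No usage guidance was provided with the card. Since the base model, microsoft/git-base, is an image-captioning model, a minimal inference sketch might look like the following (the hub id `pkemon_cap_v0` and the image path are assumptions; the processor is loaded from the base checkpoint in case none was saved with the fine-tune):

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForCausalLM

# Processor from the base model; fine-tuned weights from this repository
processor = AutoProcessor.from_pretrained("microsoft/git-base")
model = AutoModelForCausalLM.from_pretrained("pkemon_cap_v0")

image = Image.open("pokemon.png")  # any RGB image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(pixel_values=inputs.pixel_values, max_length=50)

caption = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(caption)
```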
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (reconstructed as a `TrainingArguments` sketch after the list):
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
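
A sketch reconstructing these hyperparameters as 🤗 Transformers `TrainingArguments`; the `output_dir` is an assumption, and the Adam betas and epsilon are left at their defaults, which already match the values listed above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="pkemon_cap_v0",       # assumed output directory
    learning_rate=5e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=2,    # yields a total train batch size of 4
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```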
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Score |
|:-------------:|:-----:|:----:|:---------------:|:---------:|
| 11.2497 | 0.17 | 2 | 10.0191 | 96.6364 |
| 9.9157 | 0.35 | 4 | 9.5544 | 111.1818 |
| 9.4907 | 0.52 | 6 | 9.1167 | 143.5909 |
| 9.0975 | 0.7 | 8 | 8.8422 | 154.5455 |
| 8.8568 | 0.87 | 10 | 8.6143 | 144.6364 |
| 8.6299 | 1.04 | 12 | 8.4336 | 118.7727 |
| 8.4659 | 1.22 | 14 | 8.2808 | 112.4091 |
| 8.3233 | 1.39 | 16 | 8.1538 | 124.3636 |
| 8.2213 | 1.57 | 18 | 8.0420 | 122.8636 |
| 8.0876 | 1.74 | 20 | 7.9463 | 124.5 |
| 7.9863 | 1.91 | 22 | 7.8647 | 153.9545 |
| 7.9169 | 2.09 | 24 | 7.7966 | 156.0 |
| 7.8652 | 2.26 | 26 | 7.7400 | 155.5455 |
| 7.8245 | 2.43 | 28 | 7.6962 | 142.0909 |
| 7.7512 | 2.61 | 30 | 7.6659 | 129.9545 |
| 7.7344 | 2.78 | 32 | 7.6491 | 127.2727 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3