---
library_name: transformers
license: mit
base_model: gpt2
tags:
  - generated_from_trainer
model-index:
  - name: gpt2_model
    results: []
---

# gpt2_model

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0327

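A minimal inference sketch with the `transformers` library, assuming the checkpoint is published under a Hub id such as `vasumathin298/gpt2_model` (a placeholder; substitute the actual repository path or a local directory containing this checkpoint):

```python
# Load the fine-tuned GPT-2 checkpoint and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "vasumathin298/gpt2_model"  # hypothetical id; replace with the real path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
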
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

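The hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the original training script: the output directory and per-epoch evaluation strategy are assumptions (the latter inferred from the per-epoch validation losses reported below), and the dataset/model wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2_model",   # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,            # Adam betas and epsilon as reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",     # assumption: validation loss is reported once per epoch
)
```
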
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 9    | 0.4230          |
| No log        | 2.0   | 18   | 0.2894          |
| No log        | 3.0   | 27   | 0.1959          |
| No log        | 4.0   | 36   | 0.1219          |
| No log        | 5.0   | 45   | 0.0686          |
| No log        | 6.0   | 54   | 0.0468          |
| No log        | 7.0   | 63   | 0.0402          |
| No log        | 8.0   | 72   | 0.0360          |
| No log        | 9.0   | 81   | 0.0336          |
| No log        | 10.0  | 90   | 0.0327          |

### Framework versions

- Transformers 4.45.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1