gpt2-medium-nlg-multiwoz21

This model is a fine-tuned version of GPT2-medium on MultiWOZ 2.1.

Refer to ConvLab-3 for model description and usage.
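
As a quick orientation, the sketch below shows one way to load the checkpoint with the standard Transformers API. The `ConvLab/` repo namespace and the prompt placeholder are assumptions; ConvLab-3 defines the actual input format (linearized dialogue acts) expected by this NLG model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id: the "ConvLab" namespace is an assumption.
MODEL_ID = "ConvLab/gpt2-medium-nlg-multiwoz21"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Placeholder prompt: the real input representation is defined by
# ConvLab-3; this only illustrates the generation call.
prompt = "..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```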

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after the list for how they map onto Transformers training arguments):

  • learning_rate: 5e-5
  • train_batch_size: 64
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: AdamW
  • lr_scheduler_type: linear
  • num_epochs: 20
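
As a rough guide, here is how those values would look as Transformers `TrainingArguments`. This is a sketch assuming single-device training (64 per device × 2 accumulation steps = 128 total); `output_dir` is hypothetical and any argument not listed above keeps its library default.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2-medium-nlg-multiwoz21",  # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=64,   # train_batch_size
    gradient_accumulation_steps=2,    # 64 * 2 = 128 total
    lr_scheduler_type="linear",
    num_train_epochs=20,
    # AdamW is the default optimizer in Transformers,
    # so no explicit optim argument is needed.
)
```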

Framework versions

  • Transformers 4.23.1
  • PyTorch 1.10.1+cu111