gpt2-fantasy

This model is a fine-tuned version of gpt2 on an IMDB fantasy synopsis dataset.

Model description

This model was fine-tuned with the intention of generating short fantasy stories from a given set of keywords. You can test this model here.
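A minimal generation sketch using the `transformers` library is shown below. The exact prompt format the model expects is not documented in this card, so the comma-separated keyword prompt and the sampling settings here are assumptions, not the author's recipe.

```python
# Hedged sketch: assumes the model takes a comma-separated keyword prompt.

def build_prompt(keywords):
    """Turn a list of keywords into a comma-separated prompt string."""
    return ", ".join(keywords)

def generate_story(keywords, max_new_tokens=150):
    # Imported lazily so build_prompt stays usable without TensorFlow installed.
    from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

    tokenizer = GPT2Tokenizer.from_pretrained("egosumkira/gpt2-fantasy")
    model = TFGPT2LMHeadModel.from_pretrained("egosumkira/gpt2-fantasy")

    inputs = tokenizer(build_prompt(keywords), return_tensors="tf")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,          # sample rather than greedy-decode (illustrative settings)
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate_story(["time travel", "magic", "rescue"])` would prompt the model with `"time travel, magic, rescue"` and return a decoded story.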

Training data

Training data was parsed from the IMDB website and consists of keywords-synopsis pairs. The method of encoding the data was inspired by this repo.
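A keywords-synopsis pair is typically flattened into a single training string for GPT-2. The sketch below illustrates one plausible scheme; the actual separator tokens are an assumption (the card only says the encoding was inspired by another repo), so the real format may differ.

```python
# Hypothetical encoding of one keywords-synopsis pair into a training string.
# kw_sep, boundary, and eos are assumed values, not the author's actual tokens.

def encode_example(keywords, synopsis,
                   kw_sep=", ", boundary=" ~ ", eos="<|endoftext|>"):
    """Join a list of keywords and a synopsis into one training string."""
    return kw_sep.join(keywords) + boundary + synopsis + eos

sample = encode_example(
    ["time travel", "magic", "rescue"],
    "A young mage travels back in time to save her mentor.",
)
print(sample)
```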

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam
  • dropout: 0.2
  • learning schedule: exponential decay
  • epochs: 4
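The setup above can be sketched in TensorFlow/Keras as follows. Only the optimizer family, dropout value, schedule type, and epoch count appear in the card; the initial learning rate, decay steps, and decay rate below are assumptions for illustration.

```python
import tensorflow as tf

# Reconstruction of the listed training setup. Numeric schedule values are
# assumptions -- only "Adam" and "exponential decay" are stated in the card.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=3e-5,  # assumed; not given in the card
    decay_steps=1000,            # assumed
    decay_rate=0.9,              # assumed
)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

# Dropout of 0.2 would be applied via the GPT-2 config, e.g.:
# model = TFGPT2LMHeadModel.from_pretrained(
#     "gpt2", resid_pdrop=0.2, embd_pdrop=0.2, attn_pdrop=0.2)
```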

Training results

Keywords: time travel, magic, rescue.
Output: A group of young men find themselves in the middle of a desert, and must band together to save their friend from an evil force bent on destroying his world. But what if they don't trust each other? Is there something that might help them at all costs? Can they break the spell before it's too late? Will they be able to stop this Evil once and for all? Or will they do anything about it? Above all, can they make things right with humanity? And above all, is there some really good magic going on here? What are they trying to do?

Framework versions

  • Transformers 4.29.2
  • TensorFlow 2.12.0
  • Tokenizers 0.13.3