---
license: mit
datasets:
- wikitext
language:
- en
library_name: transformers
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- general
- history
- business
---
A quantized GPT-2 model.
Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. Whereas its predecessor GPT-1 was pre-trained on BookCorpus, a dataset of over 7,000 unpublished fiction books from various genres, GPT-2 was trained on WebText, a dataset of 8 million web pages.
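
A minimal usage sketch with the `transformers` text-generation pipeline (matching the `pipeline_tag` above). The repo id below is a placeholder, not this model's actual Hub id:

```python
from transformers import pipeline

# Hypothetical Hub id; replace with this model's actual repository id.
generator = pipeline("text-generation", model="your-username/gpt2-quantized")

# Generate a short continuation of the prompt.
output = generator("Once upon a time,", max_new_tokens=40, do_sample=True)
print(output[0]["generated_text"])
```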