---
license: mit
datasets:
- wikitext
language:
- en
library_name: transformers
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- general
- history
- business
---

A quantized GPT-2 model for text generation.

Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained on WebText, a dataset of about 8 million web pages; its predecessor, GPT-1, was trained on BookCorpus, a dataset of over 7,000 unpublished fiction books from various genres.
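
## Usage

A minimal sketch of loading the model with the `transformers` text-generation pipeline. The repository ID below is a placeholder, not the actual path of this model; substitute the correct one.

```python
from transformers import pipeline

# "your-username/gpt2-quantized" is a placeholder repository ID;
# replace it with this model's actual Hub path.
generator = pipeline("text-generation", model="your-username/gpt2-quantized")

# Generate a short continuation of a prompt.
outputs = generator("The history of commerce began", max_new_tokens=50)
print(outputs[0]["generated_text"])
```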