---
license: mit
datasets:
- chloeliu/reddit_nosleep_posts
language:
- en
tags:
- fun
- horror
- writing
widget:
- text: "[WP] We don't go to ravenholm anymore [RESPONSE] "
  example_title: "[WP] We don't go to ravenholm anymore [RESPONSE] "
co2_eq_emissions:
  emissions: 60
  source: "https://mlco2.github.io/impact/#compute"
  training_type: "fine-tuning"
  geographical_location: "Oregon, USA"
  hardware_used: "1 T4, Google Colab"
---

# GPT-NoSleep-355m
A fine-tuned version of [GPT2-Medium](https://huggingface.co/gpt2-medium) on the [reddit_nosleep_posts](https://huggingface.co/datasets/chloeliu/reddit_nosleep_posts) dataset (linked above).

# Training Procedure
This model was trained on the `reddit_nosleep_posts` dataset using the [Happy Transformer](https://github.com/EricFillion/happy-transformer) library on Google Colab, for X epochs with a learning rate of 1e-2.
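
For reference, a minimal sketch of what the fine-tuning call might look like with Happy Transformer (assuming its 2.x `HappyGeneration` API; the training file name and epoch count below are placeholders, since the exact values are not recorded here):

```python
# Minimal fine-tuning sketch, assuming the happytransformer 2.x API.
# "train.txt" and num_train_epochs are placeholders, not values from this card.
from happytransformer import HappyGeneration, GENTrainArgs

# Load GPT2-Medium as the base model
happy_gen = HappyGeneration("GPT2", "gpt2-medium")

# train.txt would hold the "[WP] ... [RESPONSE] ..." formatted posts
args = GENTrainArgs(learning_rate=1e-2, num_train_epochs=1)
happy_gen.train("train.txt", args=args)

happy_gen.save("gpt-nosleep-355m/")
```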

# Biases & Limitations
This model likely inherits the same biases and limitations as the original GPT2 it is based on, along with heavy additional biases from the r/nosleep dataset.
It will likely generate offensive output.

# Intended Use
This model is meant for fun, nothing else.
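
To try it out, a minimal generation sketch using the `transformers` pipeline is below; the repo id is a placeholder for wherever this model is hosted, and the prompt follows the `[WP] ... [RESPONSE] ` format shown in the widget above.

```python
# Minimal generation sketch using the Hugging Face transformers pipeline.
# "your-username/GPT-NoSleep-355m" is a placeholder repo id, not the actual path.
from transformers import pipeline

generator = pipeline("text-generation", model="your-username/GPT-NoSleep-355m")

# Prompts use the "[WP] <prompt> [RESPONSE] " format from the widget above
prompt = "[WP] We don't go to ravenholm anymore [RESPONSE] "
result = generator(prompt, max_new_tokens=200, do_sample=True, top_p=0.95)
print(result[0]["generated_text"])
```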