---
tags:
- text-generation
- 8bit
- 8-bit
- quantization
- compression
- chatbot
- dialogue
- conversation
datasets:
- kilt_tasks
inference: false
license: apache-2.0
---

# ethzanalytics/gpt-j-8bit-KILT_WoW_10k_steps
This is a version of hivemind/gpt-j-6B-8bit fine-tuned on the Wizard of Wikipedia dataset for 10k steps (just under one epoch) on an A100. It can be used as a chatbot, and it is designed to be used with ai-msgbot to take advantage of that project's prompt engineering.
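Since the model expects dialogue-formatted input, a prompt is typically assembled from alternating speaker turns before generation. A minimal sketch of this idea is below; the `Person Alpha`/`Person Beta` speaker labels and the `build_prompt` helper are illustrative assumptions, so check the ai-msgbot documentation for the exact format this checkpoint was trained on.

```python
# Sketch of a two-speaker dialogue prompt in the style ai-msgbot uses.
# NOTE: the speaker labels here are hypothetical placeholders; the real
# labels must match what the fine-tuned checkpoint saw during training.
def build_prompt(history, user_message,
                 user_label="Person Alpha", bot_label="Person Beta"):
    """Assemble a dialogue-formatted prompt from prior turns plus a new message."""
    lines = [f"{speaker}: {text}" for speaker, text in history]
    lines.append(f"{user_label}: {user_message}")
    lines.append(f"{bot_label}:")  # the model completes this final turn
    return "\n".join(lines)

history = [
    ("Person Alpha", "hi there!"),
    ("Person Beta", "hello! what would you like to talk about?"),
]
prompt = build_prompt(history, "tell me about the Wizard of Wikipedia dataset")
print(prompt)
```

The generated text is then everything the model produces after the trailing `Person Beta:` line, usually truncated at the next speaker label.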
## Usage
**NOTE:** this model needs to be loaded via the special patching technique outlined in the hivemind model card (as with all 8-bit models of this type).

Examples of how to load the model correctly are already in place in the notebook linked above. A `.py` version of that notebook was uploaded to the repo for reference - link here
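For intuition about what the 8-bit patching buys you, the core idea behind this style of compression is to store each weight matrix as int8 codes plus per-row scale and offset, dequantizing on the fly. The snippet below is an illustrative NumPy sketch of that row-wise quantization idea, not the actual hivemind implementation; the function names are hypothetical.

```python
import numpy as np

def quantize_rowwise(w):
    """Quantize a float32 matrix to uint8 codes with per-row scale/offset.

    This mirrors the general idea of 8-bit weight storage: each row keeps
    its own min (offset) and step size (scale), so 256 levels cover its range.
    """
    lo = w.min(axis=1, keepdims=True)
    hi = w.max(axis=1, keepdims=True)
    scale = (hi - lo) / 255.0
    scale = np.where(scale == 0, 1.0, scale)  # avoid division by zero on flat rows
    codes = np.round((w - lo) / scale).astype(np.uint8)
    return codes, scale, lo

def dequantize_rowwise(codes, scale, lo):
    """Reconstruct an approximate float32 matrix from the 8-bit representation."""
    return codes.astype(np.float32) * scale + lo

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 16)).astype(np.float32)
codes, scale, lo = quantize_rowwise(w)
w_hat = dequantize_rowwise(codes, scale, lo)
print("max reconstruction error:", float(np.max(np.abs(w - w_hat))))
```

The storage drops from 4 bytes to roughly 1 byte per weight, at the cost of a small per-row rounding error; the real 8-bit GPT-J additionally keeps adapters and layernorms in higher precision.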
## Training
For details, please see this wandb report, which covers both the daily-dialogues version and the WoW version.