Distil GPT

This is a small version of Generative Pre-trained Transformer 2 (GPT-2), pretrained by AI Systems on 10 GB of Pakistan's legal corpus using causal language modelling, for generating legal text.

Reference:

This model was originally derived from distilGPT2, developed by Hugging Face (https://huggingface.co/distilgpt2).
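Since the model is a distilGPT2 variant trained with a causal language-modelling objective, it can be loaded and sampled with the standard Hugging Face `transformers` API. A minimal sketch follows; the checkpoint ID here is the `distilgpt2` base model from the reference above, and should be replaced with this model's own Hub ID to use the legal-domain weights. The prompt is an illustrative placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "distilgpt2" is the base checkpoint this model was derived from;
# substitute this model's own Hub repository ID for the fine-tuned weights.
model_id = "distilgpt2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Causal LM generation: the model continues the prompt token by token.
prompt = "The High Court held that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

With sampling enabled (`do_sample=True`), each run produces a different continuation of the prompt.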
