MahaGPT

MahaGPT is a Marathi GPT2 model (137M parameters), pre-trained on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets ([dataset link](https://github.com/l3cube-pune/MarathiNLP)).
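
A minimal usage sketch with the Hugging Face `transformers` library is given below; the Hub model ID (`l3cube-pune/marathi-gpt`), the prompt, and the sampling settings are illustrative assumptions, not specifics taken from this card.

```python
# Minimal generation sketch using Hugging Face transformers.
# The Hub model ID below is an assumption; check the actual repo ID on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "l3cube-pune/marathi-gpt"  # assumed Hub ID for MahaGPT

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a short Marathi prompt and sample a continuation.
inputs = tokenizer("मराठी ही एक", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```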

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).

@article{joshi2022l3cube,
  title={L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2202.01159},
  year={2022}
}