---
license: openrail
language:
  - tl
tags:
  - language
  - gpt
  - remake
  - v2
  - pytorch
  - pickle
  - gpt2
  - open sourced
pipeline_tag: text-generation
---

Colab used to train this model 👉👉 gpt remaker. Both training and inference are included in the Colab. Happy coding!

## Model Information

- Model Name: GPTagalog
- Version: 2
- Training Iterations: 143,000
- Learning Rate: 6e-4
- Language: Tagalog
- Compatibility: Pickle (.pkl) format, CUDA (see the loading sketch below)
- Model Size: 30 MB
- Training Time: Approximately 2 hours and 30 minutes
- Usage: Experimental; not suitable for commercial purposes
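Below is a minimal loading sketch. The filename `gptagalog_v2.pkl` and the structure of the checkpoint are assumptions, not details from this card; consult the Colab linked above for the exact model class and file name.

```python
import torch

# Hypothetical filename; check the repository files for the actual .pkl name.
CKPT_PATH = "gptagalog_v2.pkl"

# The card notes the checkpoint targets CUDA; fall back to CPU if unavailable.
device = "cuda" if torch.cuda.is_available() else "cpu"

# torch.load unpickles the saved object and maps its tensors onto the device.
# weights_only=False is required on recent PyTorch versions to unpickle
# arbitrary objects rather than plain tensors.
checkpoint = torch.load(CKPT_PATH, map_location=device, weights_only=False)

if isinstance(checkpoint, dict):
    # Likely a state_dict: it must be loaded into the model class from the Colab.
    print("Checkpoint keys:", list(checkpoint)[:5])
else:
    # A fully pickled model object can be used directly.
    model = checkpoint.to(device).eval()
    print("Loaded model object:", type(model).__name__)
```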

## Model Description

GPTagalog was designed to explore how well a language model trained on a small dataset performs at generating text in Tagalog.

## Training Details

Iterations and Epochs: GPTagalog was trained for 143,000 iterations over 60 epochs. This extended training period aimed to refine its language generation abilities.

Learning Rate: The model was trained with a learning rate of 6e-4, a value chosen to balance learning speed with stable convergence.

Model Size: GPTagalog is relatively small, with a file size of 30 MB. This reflects its experimental nature and the limited resources available for training.
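For reference, the hyperparameters above can be gathered into a small configuration sketch. The optimizer choice (AdamW) and the placeholder model are assumptions here, not details taken from the Colab.

```python
import torch

# Hyperparameters reported on this card.
MAX_ITERS = 143_000
EPOCHS = 60
LEARNING_RATE = 6e-4

# Placeholder module standing in for the GPT architecture defined in the Colab.
model = torch.nn.Linear(16, 16)

# AdamW is assumed; the Colab may use a different optimizer or an LR schedule.
optimizer = torch.optim.AdamW(model.parameters(), lr=LEARNING_RATE)
print(f"Training for {MAX_ITERS:,} iterations over {EPOCHS} epochs at lr={LEARNING_RATE}")
```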

## Usage Guidelines

Experimental Use: GPTagalog Version 2 is an experimental model and is not recommended for commercial purposes. It may have limitations in generating coherent and contextually accurate text.
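When probing generation quality, a generic autoregressive sampling loop like the one below can be used. It assumes the model returns logits of shape `(batch, sequence, vocab)` when called on token ids, which may not match the exact interface in the Colab.

```python
import torch

@torch.no_grad()
def sample(model, idx, max_new_tokens=50, temperature=0.8):
    """Generic sampling loop; `idx` is a (batch, seq_len) tensor of token ids."""
    model.eval()
    for _ in range(max_new_tokens):
        logits = model(idx)[:, -1, :] / temperature   # logits for the last position
        probs = torch.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)
        idx = torch.cat([idx, next_id], dim=1)        # append the sampled token
    return idx
```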

Resource Constraints: Due to resource limitations, training was capped at 143,000 iterations and roughly 6 hours of compute. This is considerably shorter than the training runs of larger models such as GPT-2, which have over a hundred million parameters and take days to train.