hubert-medium-wiki-seq128

The fully trained model, including the second training phase, is available here: SzegedAI/hubert-medium-wiki

This model was trained from scratch on the Wikipedia subset of the Hungarian Webcorpus 2.0, using the masked language modeling (MLM) and sentence order prediction (SOP) objectives.
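
As a quick usage sketch (not part of the original card), the checkpoint can be probed through a fill-mask pipeline, since it was pre-trained with an MLM objective. The repository id below is the one listed on the Hub for this card, and the [MASK] token assumes a BERT-style tokenizer; both are assumptions, so adjust if the hosted checkpoint differs.

```python
from transformers import AutoTokenizer, TFAutoModelForMaskedLM, pipeline

# Assumed repository id, taken from this card's Hub listing.
MODEL_ID = "SzegedAI/hubertusz-medium-wiki-seq128"

# Load tokenizer and TF weights explicitly (the Hub config may not
# declare a pipeline type for this checkpoint).
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = TFAutoModelForMaskedLM.from_pretrained(MODEL_ID)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Hungarian: "The capital of Hungary is [MASK]."
for pred in fill_mask("Magyarország fővárosa [MASK]."):
    print(f"{pred['token_str']}\t{pred['score']:.4f}")
```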

Pre-Training Parameters:

  • Training steps: 500,000
  • Sequence length: 128 (the architecture supports inputs up to 512; see the tokenization sketch after this list)
  • Batch size: 1024
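
Because this phase of pre-training used 128-token sequences, truncating or padding inputs to 128 matches the training conditions most closely. A minimal sketch, again assuming the Hub repository id above and TF weights:

```python
from transformers import AutoTokenizer, TFAutoModel

MODEL_ID = "SzegedAI/hubertusz-medium-wiki-seq128"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = TFAutoModel.from_pretrained(MODEL_ID)

# Pad/truncate to the 128-token pre-training length; longer inputs
# (up to 512) fit the architecture but were not seen in this phase.
inputs = tokenizer(
    "Szeged Magyarország harmadik legnagyobb városa.",
    max_length=128,
    truncation=True,
    padding="max_length",
    return_tensors="tf",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, 128, hidden_size)
```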

Framework versions

  • Transformers 4.21.3
  • TensorFlow 2.10.0
  • Datasets 2.4.0
  • Tokenizers 0.12.1

Acknowledgement

Artificial Intelligence National Laboratory, Hungary
