---
language: hu
license: apache-2.0
datasets:
- wikipedia
tags:
- generated_from_keras_callback
- hubert
model-index:
- name: hubert-tiny-wiki-seq128
  results: []
---

# hubert-tiny-wiki-seq128

The fully trained model, including the second phase of training, is available here: [SzegedAI/hubert-tiny-wiki](https://huggingface.co/SzegedAI/hubert-tiny-wiki)

This model was trained from scratch on the Wikipedia subset of Hungarian Webcorpus 2.0 with the MLM and SOP pre-training tasks.

### Pre-Training Parameters:

- Training steps: 500,000
- Sequence length: 128 (the model is capable of handling sequences up to 512 tokens)
- Batch size: 1024

### Framework versions

- Transformers 4.21.3
- TensorFlow 2.10.0
- Datasets 2.4.0
- Tokenizers 0.12.1

# Acknowledgement

[![Artificial Intelligence - National Laboratory - Hungary](https://milab.tk.hu/uploads/images/milab_logo_en.png)](https://mi.nemzetilabor.hu/)
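
# Usage

A minimal inference sketch for trying the checkpoint on the MLM task it was pre-trained with. It assumes the model loads through the standard `transformers` Auto classes with the TensorFlow weights listed above and exposes a masked-LM head; the repo id and example sentence are illustrative, not taken from the original training code.

```python
from transformers import AutoTokenizer, TFAutoModelForMaskedLM
import tensorflow as tf

model_id = "SzegedAI/hubert-tiny-wiki-seq128"  # assumed repo id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForMaskedLM.from_pretrained(model_id)

# Predict the masked token in a short Hungarian sentence.
text = f"Budapest Magyarország {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="tf")
logits = model(**inputs).logits

# Locate the mask position and take the highest-scoring vocabulary item.
mask_pos = int(tf.where(inputs["input_ids"][0] == tokenizer.mask_token_id)[0, 0])
predicted_id = int(tf.argmax(logits[0, mask_pos]))
print(tokenizer.decode([predicted_id]))
```

Since this checkpoint was trained at sequence length 128, inputs longer than 128 tokens are better served by the fully trained [SzegedAI/hubert-tiny-wiki](https://huggingface.co/SzegedAI/hubert-tiny-wiki) model linked above.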