---
language: hu
license: apache-2.0
datasets:
  - wikipedia
tags:
  - generated_from_keras_callback
  - hubert
model-index:
  - name: hubert-tiny-wiki-seq128
    results: []
---

# hubert-tiny-wiki-seq128

This model was trained from scratch on the Wikipedia subset of Hungarian Webcorpus 2.0, using masked language modeling (MLM) and sentence order prediction (SOP) objectives.

## Parameters

- Pre-training steps: 500,000
- Sequence length: 128 (the model supports up to 512)
- Batch size: 1024
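The pre-training scale implied by these hyperparameters can be sanity-checked with quick arithmetic (a rough estimate: it assumes every sequence is padded/packed to the full 128 tokens):

```python
# Training-scale estimate from the hyperparameters listed above.
steps = 500_000       # pre-training optimizer steps
seq_len = 128         # tokens per sequence
batch_size = 1024     # sequences per step

tokens_per_step = seq_len * batch_size   # tokens processed per optimizer step
total_tokens = steps * tokens_per_step   # total tokens seen during pre-training

print(tokens_per_step)  # 131072
print(total_tokens)     # 65536000000 (~65.5 billion tokens)
```

This puts the run at roughly 65.5 billion tokens processed, assuming fully packed 128-token sequences.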

## Framework versions

- Transformers 4.21.3
- TensorFlow 2.10.0
- Datasets 2.4.0
- Tokenizers 0.12.1

## Acknowledgement

Artificial Intelligence - National Laboratory - Hungary