
Tulio

Tulio is a BERT model trained on Chilean Spanish. It is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased, trained on the Spanish Books and Small Chilean Spanish Corpus datasets.
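Since Tulio is a masked language model, it can be queried through the transformers fill-mask pipeline. A minimal sketch, using the Hub ID shown on this card (the example sentence is purely illustrative):

```python
from transformers import pipeline

# Load Tulio through the fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="jorgeortizfuentes/tulio-bert")

# BERT-style models expect the [MASK] token.
for prediction in fill_mask("Santiago es la [MASK] de Chile."):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Each prediction is a dict containing the candidate token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).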

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding configuration follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 20
  • eval_batch_size: 20
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 2
  • total_train_batch_size: 20
  • total_eval_batch_size: 20
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2.0
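For reference, here is a hedged sketch of how these values map onto transformers.TrainingArguments. The output_dir is hypothetical, and the distributed settings (distributed_type: multi-GPU, num_devices: 2) come from the launch environment (e.g. torchrun) rather than from these arguments:

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="tulio-bert",          # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    seed=42,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
)
```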

Acknowledgments

We are grateful to the Computer Science Department of the University of Chile and the ReLeLa (Representations for Learning and Language) study group for providing the servers used to train the model.

License Disclaimer

The gpl-3.0 license best describes our intentions for our work. However, we cannot be sure that all the datasets used to train the model have licenses compatible with gpl-3.0. Please use it at your own discretion and verify that the licenses of the original text resources match your needs.

Limitations

The training dataset was not censored in any way, so the model may reproduce unwanted ideological representations present in the data. Use with caution.

