distilbert-maccrobat

This model is a fine-tuned version of distilbert-base-cased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7475
  • Precision: 0.5070
  • Recall: 0.5725
  • F1: 0.5377
  • Accuracy: 0.7903
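As a quick sanity check (not part of the original card), the reported F1 is consistent with the harmonic mean of the reported precision and recall; the small discrepancy in the last digit comes from the inputs being rounded to four decimals:

```python
# Recompute F1 from the rounded precision/recall figures in the table above.
precision = 0.5070
recall = 0.5725

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(f1)  # ~0.5378, matching the reported 0.5377 up to input rounding
```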
  • Downloads last month: 90
  • Model size: 65.3M params (Safetensors)
  • Tensor type: F32

Model tree for SahuH/distilbert-maccrobat: fine-tuned from distilbert-base-cased (one of 239 fine-tunes of that base model).