---
language:
  - en
license: mit
base_model: distilbert-base-uncased
tags:
  - low-resource NER
  - token_classification
  - biomedicine
  - medical NER
  - generated_from_trainer
datasets:
  - bigbio/biored
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: Dagobert42/distilbert-base-uncased-biored-finetuned
    results: []
---

# Dagobert42/distilbert-base-uncased-biored-finetuned

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [bigbio/biored](https://huggingface.co/datasets/bigbio/biored) dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.6868
- Accuracy: 0.7768
- Precision: 0.5392
- Recall: 0.4561
- F1: 0.4898
- Weighted F1: 0.764
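
A minimal usage sketch (assuming the standard `transformers` token-classification pipeline; the example sentence is illustrative):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a token-classification (NER) pipeline.
ner = pipeline(
    "token-classification",
    model="Dagobert42/distilbert-base-uncased-biored-finetuned",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entity spans
)

print(ner("Famotidine is a histamine H2-receptor antagonist used to treat gastric ulcers."))
```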

## Model description

DistilBERT-based token classification model for biomedical named-entity recognition (NER), fine-tuned for low-resource settings on the BioRED corpus.

## Intended uses & limitations

Intended for biomedical NER experiments in low-resource settings. The model was fine-tuned on only 200 samples, and its entity-level recall (0.4561) is correspondingly limited; outputs should be validated before any downstream use.

## Training and evaluation data

The model was fine-tuned on a 200-sample training split of the bigbio/biored dataset (biored-train_200_splits.pt); the results above are reported on the accompanying evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
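
As a hedged sketch, these settings map onto `transformers.TrainingArguments` roughly as follows (the `output_dir` is illustrative; the actual training script is not part of this card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. The Adam betas and epsilon
# are the TrainingArguments defaults, which match the reported values.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-biored-finetuned",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```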

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Weighted F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:-----------:|
| No log        | 1.0   | 25   | 0.9323          | 0.7124   | 0.3944    | 0.1486 | 0.1309 | 0.5993      |
| No log        | 2.0   | 50   | 0.8737          | 0.7248   | 0.5187    | 0.2132 | 0.2341 | 0.6271      |
| No log        | 3.0   | 75   | 0.8157          | 0.7353   | 0.4968    | 0.2886 | 0.3314 | 0.6804      |
| No log        | 4.0   | 100  | 0.7927          | 0.7452   | 0.5213    | 0.3185 | 0.3686 | 0.6883      |
| No log        | 5.0   | 125  | 0.7601          | 0.7507   | 0.5119    | 0.3734 | 0.4161 | 0.7116      |
| No log        | 6.0   | 150  | 0.7480          | 0.7555   | 0.5381    | 0.3829 | 0.4285 | 0.718       |
| No log        | 7.0   | 175  | 0.7393          | 0.7588   | 0.5393    | 0.4031 | 0.4479 | 0.7272      |
| No log        | 8.0   | 200  | 0.7342          | 0.7655   | 0.5512    | 0.4143 | 0.4614 | 0.7363      |
| No log        | 9.0   | 225  | 0.7391          | 0.7591   | 0.5262    | 0.4425 | 0.4709 | 0.7395      |
| No log        | 10.0  | 250  | 0.7264          | 0.7644   | 0.5332    | 0.4539 | 0.4849 | 0.7484      |
| No log        | 11.0  | 275  | 0.7350          | 0.7694   | 0.5419    | 0.452  | 0.4852 | 0.7483      |
| No log        | 12.0  | 300  | 0.7389          | 0.77     | 0.5341    | 0.4641 | 0.4921 | 0.752       |
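
The entity-level precision, recall, and F1 above are consistent with a `seqeval`-style evaluation. Below is a sketch of such a `compute_metrics` function, assuming IOB-tagged labels and the `evaluate` library; `label_list` is a placeholder (BioRED uses entity types such as GeneOrGeneProduct), not the model's actual label map:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-GeneOrGeneProduct", "I-GeneOrGeneProduct"]  # placeholder subset

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Drop special/padding tokens (label id -100) before scoring.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```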

### Framework versions

- Transformers 4.35.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.15.0