deberta-med-ner-2

This model is a fine-tuned version of DeBERTaV3 on the PubMed dataset.

Model description

The MED-NER model was fine-tuned from DeBERTaV3 to recognize 41 medical entity types.

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding training configuration follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 32
  • seed: 69
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 25
  • mixed_precision_training: Native AMP
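
For reference, here is a minimal sketch of transformers TrainingArguments matching the list above; output_dir is a placeholder, and fp16=True stands in for Native AMP. Note that the default optimizer already uses betas=(0.9, 0.999) and epsilon=1e-08:

# A sketch of TrainingArguments matching the hyperparameters above
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-med-ner-2",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=69,
    gradient_accumulation_steps=2,   # total train batch size: 32
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=25,
    fp16=True,                       # Native AMP mixed precision
)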

Usage

The easiest way to use the model is through the Hugging Face Inference API; the second is the pipeline object offered by the transformers library.
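
For the first method, a minimal sketch of querying the serverless Inference API with requests, assuming the model is deployed there and that an access token is available in the HF_TOKEN environment variable:

import os
import requests

# Serverless Inference API endpoint for this model
API_URL = "https://api-inference.huggingface.co/models/NeuronZero/MED-NER"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # assumes HF_TOKEN is set

payload = {"inputs": "A 48 year-old female presented with vaginal bleeding and abnormal Pap smears."}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())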

# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("token-classification", model="NeuronZero/MED-NER", aggregation_strategy="simple")

result = pipe(
    """A 48 year-old female presented with vaginal bleeding and abnormal Pap smears.
Upon diagnosis of invasive non-keratinizing SCC of the cervix, she underwent a radical hysterectomy with salpingo-oophorectomy which demonstrated positive spread to the pelvic lymph nodes and the parametrium.
Pathological examination revealed that the tumour also extensively involved the lower uterine segment."""
)
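
With aggregation_strategy="simple", result is a list of dictionaries, one per detected entity span; a quick way to inspect them:

# Each aggregated span carries its label, confidence score, surface text, and character offsets
for entity in result:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))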



# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("NeuronZero/MED-NER")
model = AutoModelForTokenClassification.from_pretrained("NeuronZero/MED-NER")
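
The snippet above only loads the weights; here is a minimal sketch of running token classification by hand, reusing part of the example sentence from the pipeline call:

import torch

text = "A 48 year-old female presented with vaginal bleeding and abnormal Pap smears."
inputs = tokenizer(text, return_tensors="pt")

# Forward pass without gradients, then pick the highest-scoring label per token
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)[0]

# Map label ids back to label names via the model config, skipping the "O" class
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    label = model.config.id2label[pred.item()]
    if label != "O":
        print(token, label)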