
# PubMed 200k RCT DeBERTa v3 Model

This model is microsoft/deberta-v3-base fine-tuned on the PubMed 200k RCT dataset, which labels each sentence of a randomized controlled trial abstract by its rhetorical role.

## Model Details

  • Base model: microsoft/deberta-v3-base
  • Fine-tuned on: PubMed 200k RCT dataset
  • Task: Sequence classification (sentence-level)
  • Number of classes: 5
  • Max sequence length: 68

## Usage

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained('Vedant101/bert-uncased-pubmed-200k')
tokenizer = AutoTokenizer.from_pretrained('Vedant101/bert-uncased-pubmed-200k')
```
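A minimal inference sketch building on the snippet above. The label names and their order are an assumption based on the standard PubMed 200k RCT label set (BACKGROUND, OBJECTIVE, METHODS, RESULTS, CONCLUSIONS); verify them against `model.config.id2label` before relying on this mapping.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = 'Vedant101/bert-uncased-pubmed-200k'
model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.eval()

# Hypothetical label order -- check model.config.id2label for the actual mapping
labels = ["BACKGROUND", "OBJECTIVE", "METHODS", "RESULTS", "CONCLUSIONS"]

sentence = "Patients were randomly assigned to receive either the study drug or placebo."

# Truncate to the model's max sequence length of 68 tokens
inputs = tokenizer(sentence, truncation=True, max_length=68, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 5)
pred_id = logits.argmax(dim=-1).item()
print(labels[pred_id])
```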
## Model Size

  • Format: Safetensors
  • Parameters: 184M
  • Tensor type: F32