Tags: Fill-Mask · Transformers · PyTorch · English · bert · biology · medical

rttl-ai/BIOptimus v.0.4

Model Details

Model Description: BIOptimus v.0.4 is a BERT-like biomedical language model pre-trained on PubMed abstracts using contextualized weight distillation and curriculum learning. It achieves state-of-the-art performance on several biomedical NER datasets from the BLURB benchmark.
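Since this is a BERT-style fill-mask model, it can be sketched in use with the Hugging Face transformers `fill-mask` pipeline. The repository id below (`rttl-ai/BIOptimus-0.4`) is an assumption based on the model name; check the model page for the canonical identifier.

```python
# Sketch: fill-mask inference with BIOptimus via the transformers pipeline.
# Assumption: the Hub repo id is "rttl-ai/BIOptimus-0.4".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="rttl-ai/BIOptimus-0.4")

# BERT-style models predict the token hidden behind the [MASK] placeholder.
sentence = "The patient was diagnosed with [MASK] carcinoma."
for prediction in fill_mask(sentence, top_k=5):
    print(prediction["token_str"], round(prediction["score"], 4))
```

For the NER tasks the model is evaluated on, the checkpoint would typically be fine-tuned with a token-classification head (e.g. `AutoModelForTokenClassification`) rather than used through the fill-mask pipeline directly.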

  • Developed by: rttl-ai
  • Model Type: Language model
  • Language(s): English
  • License: Apache-2.0
  • Resources for more information:
      • Introduced in the paper BIOptimus: Pre-training an Optimal Biomedical Language Model with Curriculum Learning for Named Entity Recognition (BioNLP workshop @ ACL 2023).
      • arXiv preprint
      • More information is available in this repository.

Dataset used to train rttl-ai/BIOptimus: PubMed abstracts