---
license: apache-2.0
language:
- es
pipeline_tag: relation-classification
tags:
- sentence-transformers
- relation-classification
- bert
- biomedical
- lexical semantics
- bionlp
---

# Biomedical relation classifier with Transformers in Spanish

## Table of contents
<details>
<summary>Click to expand</summary>

- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Licensing information](#licensing-information)
  - [Citation information](#citation-information)
  - [Disclaimer](#disclaimer)

</details>
## Model description

This is a Transformers [AutoModelForSequenceClassification](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoModelForSequenceClassification) model trained to classify pairs of biomedical texts in Spanish.

## Intended uses and limitations

The model is intended to classify hierarchical relations between medical terms. It covers the following relation types: BROAD, EXACT, NARROW, and NO_RELATION.

## How to use

This model is distributed as part of the KeyCARE library. First install the keycare module to call the Transformer classifier:

```bash
python -m pip install keycare
```

You can then run the KeyCARE pipeline that uses the Transformer model:

```python
from keycare.RelExtractor import RelExtractor

# Initialize the RelExtractor object
relextractor = RelExtractor()

# Run the pipeline
source = ["cáncer", "enfermedad de pulmón", "mastectomía radical izquierda", "laparoscopia"]
target = ["cáncer de mama", "enfermedad pulmonar", "mastectomía", "Streptococcus pneumoniae"]
relextractor(source, target)

# You can also access the class storing the Transformer model
relator = relextractor.relation_method
```
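If you prefer to call the classifier directly with the Hugging Face `transformers` library instead of through KeyCARE, the sketch below shows one possible way to do so. It is a minimal example under assumptions not stated in this card: the repository id used below is a placeholder for the actual checkpoint name, and the relation labels (BROAD, EXACT, NARROW, NO_RELATION) are assumed to be exposed through the model's `id2label` configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# NOTE: placeholder repository id -- replace with the actual checkpoint name of this model.
model_id = "BSC-NLP4BIA/biomedical-relation-classifier-es"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Encode one source/target term pair as a sentence-pair input.
inputs = tokenizer("enfermedad de pulmón", "enfermedad pulmonar", return_tensors="pt")

# Pick the highest-scoring relation label (e.g. BROAD, EXACT, NARROW or NO_RELATION).
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted])
```

Since the model was trained on term pairs, the source and target terms are passed to the tokenizer as a sentence pair rather than concatenated into a single string.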
## Training

The pre-trained model used is SapBERT-from-roberta-base-biomedical-clinical-es, from the BSC-NLP4BIA research group. The model has been trained using the hierarchical structure of [SNOMED-CT](https://www.snomed.org/) mapped to the medical terms present in [UMLS](https://www.nlm.nih.gov/research/umls/index.html).

## Evaluation

To be published.

## Additional information

### Author

NLP4BIA at the Barcelona Supercomputing Center.

### Licensing information

[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

### Citation information

To be published.

### Disclaimer

<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

</details>