---
base_model: unsloth/mistral-nemo-instruct-2407-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
datasets:
- exafluence/Open-MedQA-Nexus
---

# EXF-Medistral-Nemo-12B

## Model Description

**EXF-Medistral-Nemo-12B** is a fine-tuned version of the **Mistral-Nemo-12B** model, optimized for medical-domain tasks. It was trained on the **Open-MedQA-Nexus** dataset, which integrates a wide range of medical knowledge from public datasets such as **ChatDoctor** and **icliniq** to improve the model's ability to answer medical questions accurately and reliably.

This model is designed to assist with clinical decision support, medical coding, and patient care by generating responses grounded in comprehensive medical knowledge.

## Model Architecture

- **Base Model**: Mistral-Nemo-12B
- **Parameters**: 12 billion
- **Fine-tuning Dataset**: Open-MedQA-Nexus
- **Tasks**: Medical question answering (QA), medical coding, and healthcare information retrieval

## Training Data

The model was fine-tuned on the **Open-MedQA-Nexus** dataset, which aggregates data from multiple medical QA sources, including:

- **ChatDoctor**
- **icliniq.com**
- **HealthCareMagic**
- **CareQA**
- **MedInstruct**

The dataset contains medical queries ranging from simple conditions to complex diagnoses, paired with accurate, domain-specific responses, making it a robust training source for real-world medical applications.

## Intended Use

**EXF-Medistral-Nemo-12B** is suited for:

- **Medical Question Answering**: Generating responses to patient queries or supporting healthcare professionals with clinical information.
- **Medical Coding**: Supporting tasks related to **CMS**, **OASIS**, **ICD-10**, and other coding systems.
- **Clinical Decision Support**: Assisting doctors and healthcare providers with evidence-based suggestions or answers.
- **Patient Care Tools**: Powering medical chatbots or virtual assistants for patients seeking health information.

## Performance

The model has been fine-tuned for precision in the medical domain, demonstrating high accuracy in understanding and responding to complex medical queries. It excels at:

- **Medical terminology comprehension**
- **Providing accurate ICD-10 and CMS codes**
- **Generating medically relevant and safe answers**

## Limitations

- **Not a Diagnostic Tool**: This model is not intended to replace medical professionals or provide definitive medical diagnoses. Always consult a licensed healthcare provider for medical advice.
- **Training Data Bias**: The dataset is based on publicly available medical QA data, which may not cover all edge cases or international healthcare systems.

## How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("exafluence/EXF-Medistral-Nemo-12B")
model = AutoModelForCausalLM.from_pretrained("exafluence/EXF-Medistral-Nemo-12B")

input_text = "What are the symptoms of type 2 diabetes?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)

# generate() returns a batch of token ID sequences; decode the first one
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## License

This model is provided under a proprietary license. Usage is restricted to non-commercial purposes unless explicit permission is granted.

## Citation

If you use this model, please cite:

```bibtex
@inproceedings{exafluence2024EXFMedistralNemo12B,
  title={EXF-Medistral-Nemo-12B: A Fine-Tuned Medical Language Model for Healthcare Applications},
  author={Exafluence Inc.},
  year={2024},
  url={https://huggingface.co/exafluence/EXF-Medistral-Nemo-12B},
  doi={https://doi.org/10.57967/hf/3284}
}
```

## Contact

For questions or inquiries regarding usage, licensing, or access, please contact Exafluence Inc.
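Since the base model is instruction-tuned, prompts generally follow Mistral's `[INST] … [/INST]` chat format; in practice `tokenizer.apply_chat_template` applies the exact template shipped with the checkpoint. Purely as an illustration (the template bundled with this particular checkpoint may differ), a minimal sketch of that formatting:

```python
def format_mistral_prompt(user_message: str) -> str:
    """Wrap a single user turn in Mistral-style instruction tags.

    Illustrative only: prefer
    tokenizer.apply_chat_template(messages, tokenize=False),
    which uses the template that ships with the checkpoint.
    """
    return f"[INST] {user_message.strip()} [/INST]"


prompt = format_mistral_prompt("What are the symptoms of type 2 diabetes?")
print(prompt)
# [INST] What are the symptoms of type 2 diabetes? [/INST]
```

Passing an already-formatted prompt like this to `tokenizer(...)` and `model.generate(...)` typically yields more on-task answers than raw text for instruct-tuned checkpoints.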
# Uploaded model

- **Developed by:** exafluence
- **License:** apache-2.0
- **Finetuned from model:** unsloth/mistral-nemo-instruct-2407-bnb-4bit

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.