---
title: Llama2-MedTuned-7b
emoji: 🧬
colorFrom: blue
colorTo: green
sdk: static
pinned: false
license: apache-2.0
tags:
  - biomedical
  - clinical
  - medical
---

# Model Description

Llama2-MedTuned-7b is an instruction-tuned version of the Llama2 7B model, specifically adapted for biomedical language processing tasks. It has been fine-tuned on approximately 200,000 instruction-focused samples covering a range of biomedical and clinical NLP tasks, such as Named Entity Recognition (NER), Relation Extraction (RE), and Medical Natural Language Inference (NLI).

# Instruction Tuning Procedure

This model underwent instruction tuning, a process in which the model is fine-tuned on detailed instructions to improve its ability to interpret and execute specific tasks in the biomedical domain. The tuning used a comprehensive instruction-based dataset, tailor-made to align with the requirements of biomedical NLP tasks.

# Model Capabilities

Llama2-MedTuned-7b demonstrates an enhanced understanding of biomedical contexts, effectively handling NER, RE, and NLI tasks. It shows improved accuracy in generating structured outputs suitable for evaluation with conventional metrics.

# Architecture

Llama2-MedTuned-7b is based on the autoregressive transformer model Llama2 7B. It retains the original transformer layers and attention mechanisms; adaptation to the linguistic intricacies of the biomedical field comes from instruction tuning rather than architectural changes.

# Citation

If you utilise Llama2-MedTuned-7b in your research or application, please consider citing our paper:

```bibtex
@article{rohanian2024exploring,
  title     = {Exploring the Effectiveness of Instruction Tuning in Biomedical Language Processing},
  author    = {Rohanian, Omid and Nouriborji, Mohammadmahdi and Kouchaki, Samaneh and Nooralahzadeh, Farhad and Clifton, Lei and Clifton, David A},
  journal   = {Artificial Intelligence in Medicine},
  volume    = {158},
  pages     = {103007},
  year      = {2024},
  publisher = {Elsevier},
  doi       = {10.1016/j.artmed.2024.103007},
  url       = {https://www.sciencedirect.com/science/article/pii/S0933365724002495},
  issn      = {0933-3657}
}
```
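
# Example Usage

The sketch below shows one way to load the model and prompt it for a biomedical NER task using the standard `transformers` causal-LM interface. The repository id and the instruction/input/response prompt layout are illustrative assumptions, not confirmed details of this model card; adjust them to the actual published repository and the prompt template used during tuning.

```python
# Minimal usage sketch. Assumptions: the repo id below and the prompt template
# are illustrative placeholders; replace them with the actual published values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nlpie/Llama2-MedTuned-7b"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single GPU
    device_map="auto",
)

# An instruction-style prompt for a biomedical NER task; the exact template
# used during instruction tuning may differ.
prompt = (
    "### Instruction:\n"
    "Identify all disease mentions in the following sentence.\n\n"
    "### Input:\n"
    "The patient was diagnosed with type 2 diabetes and hypertension.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens (the model's response).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```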