---
library_name: peft
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
datasets:
- ruslanmv/ai-medical-chatbot
---
# Model Card for Medical-Mixtral-7B-v1.5k
## Model Details
### Model Description
Medical-Mixtral-7B-v1.5k is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 for answering medical-assistance questions. It was fine-tuned on a subset of 1.5k records drawn from the AI Medical Chatbot dataset (ruslanmv/ai-medical-chatbot), which contains roughly 250k records in total. The goal of this model is to provide a ready-to-use chatbot for medical-assistance questions.
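For reference, a 1.5k-record subset like the one used for fine-tuning can be drawn from the dataset with the `datasets` library. This is a minimal sketch; the split name and the random seed are assumptions, not the exact sampling procedure used during training.

```python
from datasets import load_dataset

# Load the full AI Medical Chatbot dataset (~250k records)
dataset = load_dataset('ruslanmv/ai-medical-chatbot', split='train')

# Draw a reproducible 1,500-record subset, similar in size to the one used
# for fine-tuning (assumed sampling; the actual selection is not documented)
subset = dataset.shuffle(seed=42).select(range(1500))
print(subset)
```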
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Name of the fine-tuned model on the Hugging Face Hub
finetuned_model = 'ruslanmv/Medical-Mixtral-7B-v1.5k'

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(finetuned_model, trust_remote_code=True)

# Load the model with the provided adapter configuration and weights
model_pretrained = AutoModelForCausalLM.from_pretrained(
    finetuned_model,
    trust_remote_code=True,
    torch_dtype=torch.float16,
    device_map='auto',
)

# Build the prompt with the chat template and generate an answer
messages = [
    {'role': 'user', 'content': 'What should I do to reduce my weight gained due to genetic hypothyroidism?'},
]
input_ids = tokenizer.apply_chat_template(messages, return_tensors='pt').to(model_pretrained.device)
outputs = model_pretrained.generate(input_ids, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
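Loading the Mixtral-8x7B base weights in half precision requires a large amount of GPU memory. If memory is tight, the model can be loaded in 4-bit with `bitsandbytes`; the configuration below is a sketch of one workable setup, not the setup used to train or publish this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

finetuned_model = 'ruslanmv/Medical-Mixtral-7B-v1.5k'

# 4-bit NF4 quantization to fit the model on a single GPU (assumed settings)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type='nf4',
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(finetuned_model, trust_remote_code=True)
model_pretrained = AutoModelForCausalLM.from_pretrained(
    finetuned_model,
    quantization_config=bnb_config,
    device_map='auto',
    trust_remote_code=True,
)
```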
### Framework versions
- PEFT 0.10.0
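
Because this repository is published with the PEFT library, the fine-tuned weights can also be attached explicitly to the base model with `peft`. This is a minimal sketch, assuming the repository hosts a LoRA adapter compatible with `PeftModel.from_pretrained`:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model_id = 'mistralai/Mixtral-8x7B-Instruct-v0.1'
adapter_id = 'ruslanmv/Medical-Mixtral-7B-v1.5k'

# Load the base Mixtral model, then attach the fine-tuned adapter on top
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16, device_map='auto'
)
model = PeftModel.from_pretrained(base_model, adapter_id)
```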