---
license: apache-2.0
language:
- en
base_model:
- mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
tags:
- clinical trial
- foundation model
---

# Model Card for Panacea-7B-Chat
Panacea-7B-Chat is a foundation model for clinical trial search, summarization, design, and recruitment. It acquires clinical knowledge from training on 793,279 clinical trial design documents from around the world and 1,113,207 clinical study papers, and it outperforms various open-source LLMs and medical LLMs on clinical trial tasks.

For full details of this model, please read our [paper](https://arxiv.org/abs/2407.11007).

## Model Training
Panacea is trained from [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1). The training of Panacea consists of an alignment step and an instruction-tuning step.
* Alignment step: continued pre-training on a large collection of trial documents and trial-related scientific papers. This step adapts Panacea to the vocabulary commonly used in clinical trials.
* Instruction-tuning step: further trains Panacea to follow user instructions that describe the task definition and the output requirements.

Load the model in the following way (same as Mistral):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = 'linjc16/Panacea-7B-Chat'

model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)
```
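Once loaded, generation works as with any Hugging Face causal LM. The sketch below is illustrative only: the prompt wording and generation settings are assumptions for demonstration, not the prompt format prescribed by the paper.

```python
# Illustrative only: the prompt below is an assumed example, not the official
# Panacea prompt format. Adapt it to the clinical trial task you want to run.
prompt = "Summarize the following clinical trial description:\n<trial description here>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```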

## Citation
If you find our paper or models helpful, please consider citing them as follows:

```bibtex
@article{lin2024panacea,
  title={Panacea: A foundation model for clinical trial search, summarization, design, and recruitment},
  author={Lin, Jiacheng and Xu, Hanwen and Wang, Zifeng and Wang, Sheng and Sun, Jimeng},
  journal={arXiv preprint arXiv:2407.11007},
  year={2024}
}
```