Update README.md

README.md
## Models

This repository contains:

- SMALL/BASE/LARGE HuBERTECG model sizes ready to be fine-tuned on any downstream dataset or used as a feature extractor
- SMALL/BASE/LARGE HuBERTECG model sizes fine-tuned on Cardio-Learning, providing a more disease-oriented baseline for further fine-tuning

Cardio-Learning is the name we gave to the union of several 12-lead ECG datasets, including PTB, PTB-XL, CPSC, CPSC-Extra, Georgia, Chapman, Ningbo, SPH, CODE, SaMi-Trop, and Hefei.
This dataset, comprising 2.4 million ECGs from millions of patients in 4 countries, covers 164 different heart-related conditions for which the ECG is either the primary or a supportive diagnostic tool, or is used to estimate the risk of future adverse cardiovascular events.

## Usage

To load a pre-trained model:

```
import torch
from hubert_ecg import HuBERTECG

path = "path/to/your/hubert-ecg-model.pt"
checkpoint = torch.load(path, map_location='cpu')
config = checkpoint['model_config']
hubert_ecg = HuBERTECG(config)
hubert_ecg.load_state_dict(checkpoint['model_state_dict']) # pre-trained model ready to be fine-tuned or used as feature extractor
```
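
Once loaded, the backbone can serve as a feature extractor. Below is a minimal sketch, assuming HuBERTECG follows the HuggingFace-style HuBERT forward interface (a batch of raw 1-D signals in, hidden states with a `last_hidden_state` field out); the input length shown is purely hypothetical, so check the repository code for the expected ECG length and lead layout:

```
# Minimal feature-extraction sketch. ASSUMPTIONS: HuBERTECG accepts a batch of
# 1-D ECG signals and returns HuggingFace-style outputs exposing
# `last_hidden_state`; the signal length below is illustrative only.
hubert_ecg.eval()
ecg_batch = torch.randn(1, 2500)  # hypothetical: 1 ECG, 2500 samples
with torch.no_grad():
    outputs = hubert_ecg(ecg_batch)
features = outputs.last_hidden_state  # (batch, sequence, hidden_size) embeddings
```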

To load a fine-tuned model:

```
import torch
from hubert_ecg import HuBERTECG
from hubert_ecg_classification import HuBERTForECGClassification

path = "path/to/your/finetuned-hubert-ecg-model.pt"
checkpoint = torch.load(path, map_location='cpu')
config = checkpoint['model_config']
hubert_ecg = HuBERTECG(config)
hubert_ecg = HuBERTForECGClassification(hubert_ecg) # wrap the backbone with a classification head
hubert_ecg.load_state_dict(checkpoint['model_state_dict']) # fine-tuned model ready to be used or further fine-tuned
```
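
Running inference with the fine-tuned classifier might then look like the sketch below. Whether the forward pass returns raw logits, and whether the task is multi-label (as the 164 Cardio-Learning conditions would suggest), depends on the fine-tuning setup, so treat the post-processing here as an assumption:

```
# Minimal inference sketch. ASSUMPTIONS: the forward pass takes a batch of
# 1-D ECG signals and returns per-class logits; sigmoid is applied on the
# premise of a multi-label task. Input length is illustrative only.
hubert_ecg.eval()
ecg_batch = torch.randn(1, 2500)  # hypothetical: 1 ECG, 2500 samples
with torch.no_grad():
    logits = hubert_ecg(ecg_batch)
probabilities = torch.sigmoid(logits)  # per-condition probabilities (multi-label assumption)
```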

## 📚 Citation

If you use our models or find our work useful, please consider citing us:

```