Mahalingam committed · Commit dda7d01 · Parent(s): 28a9e40

Update README.md

README.md CHANGED
@@ -28,3 +28,60 @@ widget:
}
}
---

# Medical Summary Generation with BART

This project involves a DistilBART model for generating medical summaries from input text. The model is trained to understand medical data and produce concise and informative summaries.

## Table of Contents

- [Introduction](#introduction)
- [Usage](#usage)
- [Model Details](#model-details)
- [Contact](#contact)

## Introduction

The DistilBART-Med-Summary Generator is built using the Hugging Face Deep Learning Container and is designed to generate medical summaries from input text. This README provides information on how to use the model, details about the architecture, and where to find downloads.

## Usage

To use the model for medical summary generation, follow these steps:

Install the required dependencies:

- pip install transformers
- pip install torch
- pip install datasets
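Equivalently, the three dependencies can be listed in a `requirements.txt` (versions intentionally left unpinned here; pin them as needed for reproducibility):

```
transformers
torch
datasets
```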
```python
from transformers import pipeline

# Load the summarization pipeline
summarizer = pipeline("summarization", model="philschmid/distilbart-cnn-12-6-samsum")

conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''

# Call the pipeline directly on the input text
summarizer(conversation)
```
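BART-family models accept a limited number of input tokens (typically 1024), so long clinical notes may need to be split before summarization. A minimal word-based chunking sketch — the 300-word window and 50-word overlap are illustrative assumptions, not values from this project:

```python
def chunk_text(text, max_words=300, overlap=50):
    """Split long input into overlapping word windows that fit the model's input limit."""
    words = text.split()
    if len(words) <= max_words:
        return [" ".join(words)]
    step = max_words - overlap
    # Start a new window every `step` words; the overlap preserves context across chunks
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words) - overlap, step)]
```

Each chunk can then be passed to `summarizer` separately and the partial summaries joined.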

## Model Details

- Model Name: DistilBart-Med-Summary
- Task: Medical Summary Generation
- Architecture: DistilBART
- Training Data: Details about the medical dataset used for training
- Training Duration: Number of training steps, training time, etc.
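When filling in the training details above, summary quality is commonly reported with ROUGE. As a rough, dependency-free sanity check (not the official ROUGE implementation), unigram recall against a reference summary can be computed like this:

```python
from collections import Counter

def rouge1_recall(reference: str, candidate: str) -> float:
    """Fraction of reference unigrams that also appear in the candidate summary."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    # Clip each word's count to its occurrences in the candidate
    overlap = sum(min(count, cand[word]) for word, count in ref.items())
    total = sum(ref.values())
    return overlap / total if total else 0.0
```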

## Contact

For any inquiries or support related to this model, feel free to contact:

Name: Mahalingam Balasubramanian

Email: [email protected]