Spaces · Canstralian committed: Update README.md

README.md CHANGED
@@ -9,96 +9,71 @@ app_file: app.py
 pinned: false
 ---

-[![
-[![
-[![
-[![
-git clone https://github.com/yourusername/nlp-fastapi-deployment.git
-cd nlp-fastapi-deployment
-Set Up a Virtual Environment:
-bash
-Copy code
-python -m venv venv
-source venv/bin/activate  # On Windows: venv\Scripts\activate
-Install Dependencies:
-bash
-Copy code
 pip install -r requirements.txt

 Usage

-uvicorn main:app --reload
-The API will be accessible at http://127.0.0.1:8000.
-- Uvicorn: ASGI server for running FastAPI applications.
-**Ensure all dependencies are listed in requirements.txt for easy installation.**
-## Contributing
-Contributions are welcome! Please fork the repository and submit a pull request with your changes.
-## License
-This project is licensed under the MIT License. See the LICENSE file for details.
-## Acknowledgements
-- Hugging Face for providing accessible NLP models.
-- FastAPI for the high-performance API framework.
-**For a visual guide on creating a deep learning API with FastAPI, you might find the following resource helpful:**
-https://youtu.be/NrarIs9n24I
-An example chatbot using [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).
+# NLPToolkit Agent - Gradio Chatbot
+
+[![Python](https://img.shields.io/badge/python-%3E%3D%203.7-blue)](https://www.python.org/)
+[![Gradio](https://img.shields.io/badge/Gradio-v3.27.0-orange)](https://gradio.app/)
+[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-open--assistant--oasst--sft--4--pythia--12b--epoch--3.5-green)](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5)
+[![License](https://img.shields.io/badge/license-MIT-blue)](LICENSE)
+
+NLPToolkit Agent is an NLP chatbot powered by Hugging Face and Gradio, designed to assist with a variety of NLP tasks, including summarization, sentiment analysis, text classification, and entity recognition. It lets users interact with the model dynamically and adjust parameters such as temperature, max tokens, and top-p for finer control over the output.
+
+## Features
+
+- **Multiple NLP Tasks**: Perform summarization, sentiment analysis, text classification, and entity recognition.
+- **Dynamic Task Selection**: Choose the desired NLP task from a dropdown menu.
+- **Chat History**: Save and load conversation history for reference or further interaction.
+- **Language Detection**: Validate the language of user input and display an error message if the input language is unsupported.
+- **Model Parameter Control**: Fine-tune the model's behavior with sliders for max tokens, temperature, and top-p (nucleus sampling).
+- **Powered by Hugging Face and Gradio**: State-of-the-art models hosted on Hugging Face behind an intuitive Gradio interface.
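The language-detection feature above could be implemented along these lines. This is only a sketch: the supported-language set, the error text, and the function names are illustrative assumptions, with `langdetect` as the detector named in the requirements.

```python
from typing import Optional

SUPPORTED_LANGS = {"en"}  # illustrative; the real app may support more languages

def language_error(lang_code: str) -> Optional[str]:
    """Return an error message for an unsupported language code, else None."""
    if lang_code not in SUPPORTED_LANGS:
        return f"Unsupported language '{lang_code}'. Please write in English."
    return None

def validate_input(text: str) -> Optional[str]:
    # langdetect is a third-party dependency listed in requirements.txt.
    from langdetect import detect
    return language_error(detect(text))
```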
+
+## Requirements
+
+- Python 3.7 or higher
+- Gradio 3.27.0
+- Hugging Face `huggingface_hub` for model inference
+- `langdetect` for language detection
+
+Install the dependencies by running:
+
+```bash
+pip install -r requirements.txt
+```
+## Usage
+
+1. Clone the repository:
+
+   ```bash
+   git clone https://github.com/yourusername/nlptoolkit-agent.git
+   cd nlptoolkit-agent
+   ```
+
+2. Run the application:
+
+   ```bash
+   python app.py
+   ```
+
+   This starts the Gradio interface, which you can access in your browser at http://localhost:7860.
+
+3. Interact with the chatbot:
+   - Select an NLP task (summarization, sentiment analysis, etc.).
+   - Type your message in the input field and click "Generate Response."
+   - Adjust parameters (max tokens, temperature, top-p) for more finely tuned responses.
+   - Save and load chat history using the respective buttons.
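The save/load chat-history step described above can be as simple as serializing the message pairs to JSON. A minimal sketch, assuming a `(user, assistant)` pair format and a local file name that the real app may not use:

```python
import json
from pathlib import Path

def save_history(history, path="chat_history.json"):
    # Persist the list of (user, assistant) message pairs as JSON.
    Path(path).write_text(json.dumps(history, ensure_ascii=False, indent=2))

def load_history(path="chat_history.json"):
    p = Path(path)
    if not p.exists():
        return []
    # JSON has no tuple type, so restore the message pairs on load.
    return [tuple(pair) for pair in json.loads(p.read_text())]
```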
+
+## Model Used
+
+The application uses the Hugging Face model [OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5). You can swap in any other compatible model from Hugging Face if desired.
+
+## License
+
+This project is licensed under the MIT License - see the LICENSE file for details.
+
+## Contributing
+
+Feel free to fork this repository and submit pull requests. Contributions are welcome!
+
+## Acknowledgments
+
+- Hugging Face for providing the pre-trained NLP models.
+- Gradio for making it easy to build user interfaces for machine learning models.
+- OpenAssistant for their contribution to the model.
+
+Developed by Canstralian
+
+---