Update README.md
README.md
# Hugging Face Inference API:

You can use the Hugging Face Inference API to host models like mT5-small. This is a straightforward way to make API calls to the model for generation tasks. If you want to integrate a retriever with this API-based system, you would need to build that part separately (e.g., an external document store and retriever).
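As a rough, non-authoritative sketch of what such a call could look like (the `google/mt5-small` model id, the `HF_API_TOKEN` environment variable, the `generate` helper, and the example prompt are assumptions for illustration, not part of this Space):

```python
import os
import requests

# Hosted inference endpoint for the model discussed above (assumed public checkpoint).
API_URL = "https://api-inference.huggingface.co/models/google/mt5-small"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}  # your HF access token (assumed env var)

def generate(prompt: str) -> str:
    """Send a text2text-generation request to the Inference API and return the model output."""
    response = requests.post(API_URL, headers=HEADERS, json={"inputs": prompt})
    response.raise_for_status()
    return response.json()[0]["generated_text"]

# Any retrieval step would run before this call, with the retrieved passages
# prepended to the prompt, since the API itself only performs generation.
print(generate("Summarise: the players did present a newe comedie at the Globe."))
```

Note that the base mT5-small checkpoint is pretrained only and not fine-tuned for a specific task, so in practice you would point this call at a fine-tuned checkpoint.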
# Running mT5 on Hugging Face:

GPU Access: Hugging Face Spaces allows you to use GPU instances, which are essential for running mT5-small efficiently, particularly for handling the retrieval and generation tasks in a RAG pipeline.

Integration: You can deploy the mT5-small model as part of the pipeline on Hugging Face Spaces. You'll need to ensure the retriever (e.g., BM25 or FAISS) is integrated into the system and that it returns its results to the mT5 model for the generation step.
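A minimal sketch of that retrieve-then-generate flow, assuming a BM25 retriever (via the `rank_bm25` package), a toy in-memory corpus, and the `google/mt5-small` checkpoint loaded with `transformers`; the corpus, helper names, and prompt format are illustrative, not this Space's actual code:

```python
from rank_bm25 import BM25Okapi
from transformers import AutoTokenizer, MT5ForConditionalGeneration

# Toy corpus standing in for the Early Modern English documents (assumption).
documents = [
    "The players did present a newe comedie at the Globe.",
    "Her Majestie's progresse passed through Warwick in summer.",
    "A proclamation was made against unlicensed printing.",
]

# --- Retrieval step: BM25 over whitespace-tokenised documents ---
bm25 = BM25Okapi([doc.lower().split() for doc in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    return bm25.get_top_n(query.lower().split(), documents, n=k)

# --- Generation step: mT5-small conditioned on the retrieved passages ---
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

def rag_answer(query: str) -> str:
    context = " ".join(retrieve(query))
    prompt = f"question: {query} context: {context}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(rag_answer("What happened at the Globe?"))
```

Swapping BM25 for FAISS would mean embedding the documents and the query with a sentence encoder and doing nearest-neighbour search instead of the lexical scoring shown here.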
**DATASETS**

We have three datasets available to researchers working on Early Modern English in the late C16th and