tmlinhdinh committed on
Commit 80e747e • 1 Parent(s): dc89a4d

clean up git messages
Files changed (3)
  1. .gitattributes +35 -0
  2. README.md +5 -112
  3. app.py +20 -11
.gitattributes ADDED
@@ -0,0 +1,35 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,118 +1,11 @@
  ---
- title: DeployPythonicRAG
- emoji: 📉
- colorFrom: blue
- colorTo: purple
  sdk: docker
  pinned: false
  license: apache-2.0
  ---

- # Deploying Pythonic Chat With Your Text File Application
-
- In today's breakout rooms, we will be following the process that you saw during the challenge - for reference, the instructions for that are available [here](https://github.com/AI-Maker-Space/Beyond-ChatGPT/tree/main).
-
- Today, we will repeat the same process - but powered by the Pythonic RAG implementation we created last week.
-
- You'll notice a few differences in the `app.py` logic - as well as a few changes to the `aimakerspace` package to get things working smoothly with Chainlit.
-
- ## Reference Diagram (It's Busy, but It Works)
-
- ![image](https://i.imgur.com/IaEVZG2.png)
-
- ## Deploying the Application to a Hugging Face Space
-
- Due to the way the repository is created - it should be straightforward to deploy this to a Hugging Face Space!
-
- > NOTE: If you wish to go through the local deployments using `chainlit run app.py` and Docker - please feel free to do so!
-
- <details>
- <summary>Creating a Hugging Face Space</summary>
-
- 1. Navigate to the `Spaces` tab.
-
- ![image](https://i.imgur.com/aSMlX2T.png)
-
- 2. Click on `Create new Space`.
-
- ![image](https://i.imgur.com/YaSSy5p.png)
-
- 3. Create the Space by providing values in the form. Make sure you've selected "Docker" as your Space SDK.
-
- ![image](https://i.imgur.com/6h9CgH6.png)
-
- </details>
-
- <details>
- <summary>Adding this Repository to the Newly Created Space</summary>
-
- 1. Collect the SSH address from the newly created Space.
-
- ![image](https://i.imgur.com/Oag0m8E.png)
-
- > NOTE: The address is the component that starts with `git@hf.co:spaces/`.
-
- 2. Use the command:
-
- ```bash
- git remote add hf HF_SPACE_SSH_ADDRESS_HERE
- ```
-
- 3. Use the command:
-
- ```bash
- git pull hf main --no-rebase --allow-unrelated-histories -X ours
- ```
-
- 4. Use the command:
-
- ```bash
- git add .
- ```
-
- 5. Use the command:
-
- ```bash
- git commit -m "Deploying Pythonic RAG"
- ```
-
- 6. Use the command:
-
- ```bash
- git push hf main
- ```
-
- 7. The Space should automatically build as soon as the push is completed!
-
- > NOTE: The build will fail before you complete the following steps!
-
- </details>
-
- <details>
- <summary>Adding OpenAI Secrets to the Space</summary>
-
- 1. Navigate to your Space settings.
-
- ![image](https://i.imgur.com/zh0a2By.png)
-
- 2. Navigate to `Variables and secrets` on the Settings page and click `New secret`:
-
- ![image](https://i.imgur.com/g2KlZdz.png)
-
- 3. In the `Name` field, input `OPENAI_API_KEY`; in the `Value (private)` field, put your OpenAI API key.
-
- ![image](https://i.imgur.com/eFcZ8U3.png)
-
- 4. The Space will begin rebuilding!
-
- </details>
-
- ## 🎉
-
- You just deployed Pythonic RAG!
-
- Try uploading a text file and asking some questions!
-
- ## 🚧 CHALLENGE MODE 🚧
-
- For more of a challenge, please reference [Building a Chainlit App](./BuildingAChainlitApp.md)!

  ---
+ title: AI Insightsbot
+ emoji: 🐠
+ colorFrom: indigo
+ colorTo: gray
  sdk: docker
  pinned: false
  license: apache-2.0
  ---

+ Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
app.py CHANGED
@@ -63,22 +63,31 @@ rag_documents_2 = PyMuPDFLoader(file_path=DATA_LINK2).load()
  chunked_rag_documents = chunk_documents(rag_documents_1, CHUNK_SIZE, CHUNK_OVERLAP) + \
                          chunk_documents(rag_documents_2, CHUNK_SIZE, CHUNK_OVERLAP)

- embeddings = OpenAIEmbeddings(model=EMBEDDING_MODEL)
- retriever = build_retriever(chunked_rag_documents, embeddings, COLLECTION_NAME)

- rag_prompt = ChatPromptTemplate.from_template(RAG_PROMPT)
- qa_llm = ChatOpenAI(model=QA_MODEL)

- rag_chain = (
-     {"context": itemgetter("question") | retriever, "question": itemgetter("question")}
-     | rag_prompt | llm | StrOutputParser()
- )

  # Chainlit app
  @cl.on_message
- async def main(message: str):
-     response = rag_chain.invoke({"question": message})
      await cl.Message(
-         content=response["response"],  # Extract the response from the chain
          author="AI"
      ).send()

  chunked_rag_documents = chunk_documents(rag_documents_1, CHUNK_SIZE, CHUNK_OVERLAP) + \
                          chunk_documents(rag_documents_2, CHUNK_SIZE, CHUNK_OVERLAP)

+ @cl.on_chat_start
+ async def on_chat_start():
+     embeddings = OpenAIEmbeddings(model=EMBEDDING_MODEL)
+     retriever = build_retriever(chunked_rag_documents, embeddings, COLLECTION_NAME)
+
+     rag_prompt = ChatPromptTemplate.from_template(RAG_PROMPT)
+     qa_llm = ChatOpenAI(model=QA_MODEL)
+
+     rag_chain = (
+         {"context": itemgetter("question") | retriever, "question": itemgetter("question")}
+         | rag_prompt | qa_llm | StrOutputParser()
+     )
+
+     cl.user_session.set("chain", rag_chain)

  # Chainlit app
  @cl.on_message
+ async def main(message):
+     chain = cl.user_session.get("chain")
+     result = chain.invoke({"question": message.content})
+
      await cl.Message(
+         content=result,  # Extract the response from the chain
          author="AI"
      ).send()
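The app.py diff above does three things: it fixes the `llm`/`qa_llm` name mismatch in the chain, builds the chain inside `@cl.on_chat_start` instead of at module import, and stores it in `cl.user_session` so each chat session gets its own chain. A minimal stdlib-only sketch of that per-session pattern (the dict-based store and the lambda "chain" are stand-ins for Chainlit's `cl.user_session` and the LangChain chain):

```python
# Stand-in for cl.user_session: maps (session_id, key) to a value.
user_session = {}

def on_chat_start(session_id):
    # Expensive setup (embeddings, retriever, prompt, chain) runs once
    # per session rather than once per process or once per message.
    chain = lambda question: f"answer to: {question}"  # stand-in chain
    user_session[(session_id, "chain")] = chain

def on_message(session_id, content):
    # Each message fetches the session's own chain and invokes it.
    chain = user_session[(session_id, "chain")]
    return chain(content)

on_chat_start("abc")
print(on_message("abc", "What is RAG?"))  # -> answer to: What is RAG?
```

The design choice mirrors Chainlit's lifecycle: `@cl.on_chat_start` is the natural place for per-session setup, and `cl.user_session` keeps state from leaking between concurrent users.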