Omnibus committed
Commit 1146df3 · verified · 1 Parent(s): cde5b6e

Update app.py

Files changed (1)
  1. app.py +14 -3
app.py CHANGED
@@ -11,9 +11,19 @@ cont_list=list(string_json['control'])
 
 text="""
 I asked Generative AI Models about their context window. Their response was intriguing.
-
 The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum amount of text the model can consider at any one time when generating a response. This includes both the prompt provided by the user and the model’s generated text.
-
+In practical terms, the context window limits how much previous dialogue the model can “remember” during an interaction. If the interaction exceeds the context window, the model loses access to the earliest parts of the conversation. This limitation can impact the model’s consistency in long conversations or complex tasks.
+I asked Generative AI Models about their context window. Their response was intriguing.
+The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum amount of text the model can consider at any one time when generating a response. This includes both the prompt provided by the user and the model’s generated text.
+In practical terms, the context window limits how much previous dialogue the model can “remember” during an interaction. If the interaction exceeds the context window, the model loses access to the earliest parts of the conversation. This limitation can impact the model’s consistency in long conversations or complex tasks.
+I asked Generative AI Models about their context window. Their response was intriguing.
+The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum amount of text the model can consider at any one time when generating a response. This includes both the prompt provided by the user and the model’s generated text.
+In practical terms, the context window limits how much previous dialogue the model can “remember” during an interaction. If the interaction exceeds the context window, the model loses access to the earliest parts of the conversation. This limitation can impact the model’s consistency in long conversations or complex tasks.
+I asked Generative AI Models about their context window. Their response was intriguing.
+The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum amount of text the model can consider at any one time when generating a response. This includes both the prompt provided by the user and the model’s generated text.
+In practical terms, the context window limits how much previous dialogue the model can “remember” during an interaction. If the interaction exceeds the context window, the model loses access to the earliest parts of the conversation. This limitation can impact the model’s consistency in long conversations or complex tasks.
+I asked Generative AI Models about their context window. Their response was intriguing.
+The context window for a large language model (LLM) like OpenAI’s GPT refers to the maximum amount of text the model can consider at any one time when generating a response. This includes both the prompt provided by the user and the model’s generated text.
 In practical terms, the context window limits how much previous dialogue the model can “remember” during an interaction. If the interaction exceeds the context window, the model loses access to the earliest parts of the conversation. This limitation can impact the model’s consistency in long conversations or complex tasks.
 """
 
@@ -182,6 +192,7 @@ with gr.Blocks() as app:
     with gr.Row():
         query=gr.Textbox(label="Search query")
         search_btn=gr.Button("Search")
+        steps=gr.Number(value=1)
     out_box=gr.Textbox(label="Results")
     sen_box=gr.Textbox(label="Sentences")
     with gr.Row():
@@ -190,6 +201,6 @@ with gr.Blocks() as app:
         with gr.Column(scale=1):
             nouns=gr.JSON(label="Nouns")
     search_btn.click(find_query,[query,sen,nouns],[out_box,sen_box])
-    btn.click(get_nouns,inp,[sen,nouns])
+    btn.click(get_nouns,[inp,steps],[sen,nouns])
 app.launch()
 
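The first hunk expands the text block embedded in the app: a plain-English description of an LLM context window, now repeated several times, presumably to give the app a longer input to work with. As a rough, standalone illustration of the behavior that text describes (this is not part of app.py), a sliding-window truncation over a conversation might look like the sketch below; the word-count budget only stands in for a real tokenizer.

# Standalone illustration of the context-window behavior described in the text
# (not part of app.py). Word counts approximate tokens purely for demonstration.
def fit_to_context(turns, max_tokens=4096):
    kept, used = [], 0
    # walk the conversation from newest to oldest, keeping turns while they fit
    for turn in reversed(turns):
        cost = len(turn.split())
        if used + cost > max_tokens:
            break  # everything older than this point falls outside the window
        kept.append(turn)
        used += cost
    return list(reversed(kept))  # restore chronological order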
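The functional change sits in the last two hunks: a `steps` gr.Number control is added next to the search button, and the existing `btn.click` call now passes `[inp, steps]` into `get_nouns` instead of `inp` alone. The body of `get_nouns` (and the `inp`, `btn`, and `sen` components) lies outside these hunks, so the following is only a sketch of a signature that would match the new wiring; the chunk-splitting behavior shown is an assumption, not something this commit confirms.

# Hypothetical signature compatible with btn.click(get_nouns,[inp,steps],[sen,nouns]).
# The real implementation is not shown in this commit; the chunking below is assumed.
def get_nouns(text, steps=1):
    steps = max(1, int(steps))               # gr.Number delivers a float; clamp to >= 1
    size = max(1, -(-len(text) // steps))    # ceiling division: characters per chunk
    chunks = [text[i:i + size] for i in range(0, len(text), size)]
    sentences = "\n\n".join(chunks)          # goes to the `sen` output component
    nouns = {}                               # placeholder for the app's noun extraction
    return sentences, nouns                  # one return value per output in [sen, nouns]

Whatever the real body does, it has to return one value per declared output, in the same order as [sen, nouns] in the click call.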