Corvius committed
Commit 09d5baa · verified · 1 Parent(s): 82addc1

SO BACK YEEEEEEEEEEEEEEEEEEEEEEEEEEE i love open router



Files changed (1)
  1. app.py +83 -64
app.py CHANGED
@@ -6,33 +6,35 @@ import datetime
 from requests.exceptions import RequestException
 
 API_URL = os.environ.get('API_URL')
-API_KEY = os.environ.get('API_KEY')
+if API_URL is None:
+    raise ValueError("API_URL not set in env.")
 
-headers = {
-    "Authorization": f"Bearer {API_KEY}",
-    "Content-Type": "application/json",
-    'Referer': os.environ.get('REFERRER_URL')
-}
+API_KEYS = os.environ.get('API_KEYS')
+if API_KEYS is None:
+    raise ValueError("no keys in env")
+
+api_keys_list = [key.strip() for key in API_KEYS.strip().splitlines() if key.strip()]
+if not api_keys_list:
+    raise ValueError("no valid keys in env")
 
-# debug switches
+# dee baag seeweeechieeezzz u got no beechieezzz
 USER_LOGGING_ENABLED = False
 RESPONSE_LOGGING_ENABLED = True
 
 DEFAULT_PARAMS = {
     "temperature": 0.8,
     "top_p": 0.95,
-    "top_k": 40,
     "frequency_penalty": 0,
     "presence_penalty": 0,
-    "repetition_penalty": 1.1,
     "max_tokens": 512
 }
 
 def get_timestamp():
     return datetime.datetime.now().strftime("%H:%M:%S")
 
-def predict(message, history, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag):
-    history_format = [{"role": "system", "content": system_prompt}]
+def predict(message, history, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag):
+    history_format = [{"role": "system", "content": system_prompt}] if system_prompt else []
     for human, assistant in history:
         history_format.append({"role": "user", "content": human})
         if assistant:
@@ -46,62 +48,79 @@ def predict(message, history, system_prompt, temperature, top_p, top_k, frequenc
     current_params = {
         "temperature": temperature,
         "top_p": top_p,
-        "top_k": top_k,
         "frequency_penalty": frequency_penalty,
         "presence_penalty": presence_penalty,
-        "repetition_penalty": repetition_penalty,
-        "max_tokens": max_tokens
+        "max_tokens": int(max_tokens)
     }
 
     non_default_params = {k: v for k, v in current_params.items() if v != DEFAULT_PARAMS[k]}
 
     if USER_LOGGING_ENABLED and non_default_params and not message.startswith(('*', '"')):
         for param, value in non_default_params.items():
             print(f"{param}={value}")
 
     data = {
-        "model": "meta-llama/Meta-Llama-3.1-70B-Instruct",
+        "model": "meta/llama-3.1-405b-instruct",
         "messages": history_format,
         "stream": True,
-        "temperature": temperature,
-        "top_p": top_p,
-        "top_k": top_k,
-        "frequency_penalty": frequency_penalty,
-        "presence_penalty": presence_penalty,
-        "repetition_penalty": repetition_penalty,
-        "max_tokens": max_tokens
+        **current_params
     }
 
-    try:
-        with requests.post(API_URL, headers=headers, data=json.dumps(data), stream=True) as response:
-            partial_message = ""
-            for line in response.iter_lines():
-                if stop_flag[0]:
-                    response.close()
-                    break
-                if line:
-                    line = line.decode('utf-8')
-                    if RESPONSE_LOGGING_ENABLED:
-                        print(f"API Response: {line}")
-                    if line.startswith("data: "):
-                        if line.strip() == "data: [DONE]":
-                            break
-                        try:
-                            json_data = json.loads(line[6:])
-                            if 'choices' in json_data and json_data['choices']:
-                                content = json_data['choices'][0]['delta'].get('content', '')
-                                if content:
-                                    partial_message += content
-                                    yield partial_message
-                        except json.JSONDecodeError:
-                            continue
-
-            if partial_message:
-                yield partial_message
-
-    except RequestException as e:
-        print(f"Request error: {e}")
-        yield f"An error occurred: {str(e)}"
+    partial_message = ""
+
+    for api_key in api_keys_list:
+        headers = {
+            "Authorization": f"Bearer {api_key}",
+            "Content-Type": "application/json",
+        }
+
+        try:
+            response = requests.post(API_URL, headers=headers, data=json.dumps(data), stream=True)
+            if response.status_code == 200:
+                for line in response.iter_lines():
+                    if stop_flag[0]:
+                        response.close()
+                        break
+                    if line:
+                        line = line.decode('utf-8')
+                        if RESPONSE_LOGGING_ENABLED:
+                            print(f"API Response: {line}")
+                        if line.startswith("data: "):
+                            if line.strip() == "data: [DONE]":
+                                break
+                            try:
+                                json_data = json.loads(line[6:])
+                                if 'choices' in json_data and json_data['choices']:
+                                    delta = json_data['choices'][0]['delta']
+                                    content = delta.get('content', '')
+                                    if content:
+                                        partial_message += content
+                                        yield partial_message
+                            except json.JSONDecodeError:
+                                continue
+                if partial_message:
+                    yield partial_message
+                return
+            elif response.status_code == 429:
+                print(f"API key {api_key} hit rate limit, trying next key.")
+                continue
+            else:
+                response_text = response.text
+                print(f"Request error with API key {api_key}: {response.status_code} {response_text}")
+                yield f"An error occurred: {response_text}"
+                return
+
+        except RequestException as e:
+            print(f"Request exception with API key {api_key}: {e}")
+            yield f"An error occurred: {str(e)}"
+            return
+
+    print("All keys rate limited or ded.")
+    yield "All keys rate limited or ded."
 
 def import_chat(custom_format_string):
     try:
@@ -129,7 +148,9 @@ def import_chat(custom_format_string):
     return None, None
 
 def export_chat(history, system_prompt):
-    export_data = f"<|system|> {system_prompt}\n\n"
+    export_data = ""
+    if system_prompt:
+        export_data += f"<|system|> {system_prompt}\n\n"
     if history is not None:
         for user_msg, assistant_msg in history:
             export_data += f"<|user|> {user_msg}\n\n"
@@ -143,11 +164,11 @@ def stop_generation_func(stop_flag):
 
 with gr.Blocks(theme='gradio/monochrome') as demo:
     stop_flag = gr.State([False])
 
     with gr.Row():
         with gr.Column(scale=2):
             chatbot = gr.Chatbot(value=[])
-            msg = gr.Textbox(label="Message (dolphin-2.9.1-llama-3-70b for now. The provider might bug out at random. The space may restart frequently)")
+            msg = gr.Textbox(label="Message")
             with gr.Row():
                 clear = gr.Button("Clear")
                 regenerate = gr.Button("Regenerate")
@@ -163,23 +184,21 @@ with gr.Blocks(theme='gradio/monochrome') as demo:
             system_prompt = gr.Textbox("", label="System Prompt", lines=5)
             temperature = gr.Slider(0, 2, value=0.8, step=0.01, label="Temperature")
             top_p = gr.Slider(0, 1, value=0.95, step=0.01, label="Top P")
-            top_k = gr.Slider(1, 500, value=40, step=1, label="Top K")
             frequency_penalty = gr.Slider(-2, 2, value=0, step=0.1, label="Frequency Penalty")
             presence_penalty = gr.Slider(-2, 2, value=0, step=0.1, label="Presence Penalty")
-            repetition_penalty = gr.Slider(0.01, 5, value=1.1, step=0.01, label="Repetition Penalty")
             max_tokens = gr.Slider(1, 4096, value=512, step=1, label="Max Output (max_tokens)")
 
     def user(user_message, history):
         history = history or []
         return "", history + [[user_message, None]]
 
-    def bot(history, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag):
+    def bot(history, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag):
         stop_flag[0] = False
         history = history or []
         if not history:
             return history
         user_message = history[-1][0]
-        bot_message = predict(user_message, history[:-1], system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag)
+        bot_message = predict(user_message, history[:-1], system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag)
         history[-1][1] = ""
         for chunk in bot_message:
             if stop_flag[0]:
@@ -188,11 +207,11 @@ with gr.Blocks(theme='gradio/monochrome') as demo:
             history[-1][1] = chunk
             yield history
 
-    def regenerate_response(history, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag):
+    def regenerate_response(history, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag):
         if history and len(history) > 0:
             last_user_message = history[-1][0]
             history[-1][1] = None
-            for new_history in bot(history, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag):
+            for new_history in bot(history, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag):
                 yield new_history
         else:
             yield []
@@ -202,14 +221,14 @@ with gr.Blocks(theme='gradio/monochrome') as demo:
         return imported_history, imported_system_prompt
 
     msg.submit(user, [msg, chatbot], [msg, chatbot], queue=False).then(
-        bot, [chatbot, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag], chatbot
+        bot, [chatbot, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag], chatbot
     )
 
     clear.click(lambda: None, None, chatbot, queue=False)
 
     regenerate.click(
         regenerate_response,
-        [chatbot, system_prompt, temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, max_tokens, stop_flag],
+        [chatbot, system_prompt, temperature, top_p, frequency_penalty, presence_penalty, max_tokens, stop_flag],
         chatbot
     )
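
For reference, a minimal sketch of how the multi-key setup added in this commit is meant to be configured and how the 429 failover behaves. The key strings and the fake_status stub below are illustrative placeholders, not values from the repository:

# Assumed setup: the API_KEYS secret holds one key per line; the parsing line
# mirrors the expression added to app.py. Key values here are placeholders.
import os

os.environ["API_KEYS"] = "sk-or-key-one\nsk-or-key-two\n\nsk-or-key-three\n"
api_keys_list = [key.strip() for key in os.environ["API_KEYS"].strip().splitlines() if key.strip()]
print(api_keys_list)  # ['sk-or-key-one', 'sk-or-key-two', 'sk-or-key-three']

# The failover in predict() can be dry-run with a stub in place of requests.post:
# a 429 status moves on to the next key, any other status ends the loop.
def fake_status(key):  # hypothetical stand-in for the real HTTP call
    return 429 if key != "sk-or-key-three" else 200

for api_key in api_keys_list:
    if fake_status(api_key) == 429:
        print(f"API key {api_key} hit rate limit, trying next key.")
        continue
    print(f"Streaming with {api_key}")
    break

Under this reading, the commit drops the single API_KEY/Referer header block in favor of rotating through API_KEYS on rate limits, removes the top_k and repetition_penalty knobs, and points the request at a different model string.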