runtime error
Exit code: 3. Reason:
  File "/usr/local/lib/python3.11/site-packages/httpx/_client.py", line 1739, in _send_handling_redirects
    response = await self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/httpx/_client.py", line 1776, in _send_single_request
    response = await transport.handle_async_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py", line 376, in handle_async_request
    with map_httpcore_exceptions():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed
ERROR: Application startup failed. Exiting.

#------------------------------------------------------------#
#                                                            #
#       'The thing I wish you improved is...'                #
#        https://github.com/BerriAI/litellm/issues/new       #
#                                                            #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

LiteLLM: Proxy initialized with Config, Set models:
    Llama-3.1-8b-instant