runtime error
Exit code: 1. Reason:
    prediction = await Context.root_block.process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1923, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1508, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 818, in wrapper
    response = f(*args, **kwargs)
  File "/home/user/app/app.py", line 79, in parse_and_display
    text = get_gorilla_response(prompt, model)
  File "/home/user/app/app.py", line 30, in get_gorilla_response
    completion = openai.ChatCompletion.create(
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 289, in request
    result = self.request_raw(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 619, in request_raw
    raise error.APIConnectionError(
openai.error.APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='zanino.millennium.berkeley.edu', port=8000): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f5fcc945db0>: Failed to establish a new connection: [Errno 111] Connection refused'))
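For context, here is a minimal sketch of what the failing call path in app.py presumably looks like, inferred only from the stack frames above: the app points the openai SDK (pre-1.0, matching the traceback) at the Gorilla inference endpoint and calls openai.ChatCompletion.create inside get_gorilla_response. The api_base URL, model name, and function signature below are assumptions, not the Space's actual source. The crash happens because the remote host refuses the connection, so catching APIConnectionError in the handler would surface a readable message in the Gradio UI instead of killing the request.

    # Sketch only: names, URL, and defaults are assumptions based on the traceback.
    import openai

    # The traceback shows the SDK trying to reach this host on port 8000 via
    # an OpenAI-compatible /v1/chat/completions route.
    openai.api_base = "http://zanino.millennium.berkeley.edu:8000/v1"  # assumed from the error
    openai.api_key = "EMPTY"  # placeholder; the hosted Gorilla endpoint does not need a real key

    def get_gorilla_response(prompt: str, model: str = "gorilla-7b-hf-v1") -> str:
        """Call the Gorilla endpoint and return the completion text."""
        try:
            completion = openai.ChatCompletion.create(
                model=model,
                messages=[{"role": "user", "content": prompt}],
            )
            return completion.choices[0].message.content
        except openai.error.APIConnectionError as exc:
            # [Errno 111] Connection refused lands here when the server is down
            # or unreachable, as in the log above.
            return f"Could not reach the Gorilla endpoint: {exc}"

With this guard in place, an unreachable backend returns an error string to the Gradio callback rather than raising out of process_api and taking the Space down with exit code 1.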