Hiren122
Runtime error


Exit code: 3. Reason: rgs)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/proxy/utils.py", line 1859, in check_view_exists
    ret = await self.db.query_raw(
          ^^^^^^^^^^^^^^^^^^^^^^^^
    ...<13 lines>...
    )
    ^
  File "/usr/lib/python3.13/site-packages/prisma/client.py", line 463, in query_raw
    resp = await self._execute(
           ^^^^^^^^^^^^^^^^^^^^
    ...<6 lines>...
    )
    ^
  File "/usr/lib/python3.13/site-packages/prisma/client.py", line 567, in _execute
    return await self._engine.query(builder.build(), tx_id=self._tx_id)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/prisma/engine/query.py", line 244, in query
    return await self.request(
           ^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/lib/python3.13/site-packages/prisma/engine/http.py", line 97, in request
    raise errors.NotConnectedError('Not connected to the query engine')
prisma.engine.errors.NotConnectedError: Not connected to the query engine

05:14:15 - LiteLLM Proxy:ERROR: utils.py:3002 - Error getting LiteLLM_SpendLogs row count: Not connected to the query engine
ERROR:    Application startup failed. Exiting.

#------------------------------------------------------------#
#                                                             #
#      'This feature doesn't meet my needs because...'        #
#       https://github.com/BerriAI/litellm/issues/new         #
#                                                             #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

LiteLLM: Proxy initialized with Config, Set models:
    groq-llama-3.1-8b
    groq-llama-3.3-70b
    groq-gemma2-9b
    groq-whisper-v3
    groq-whisper-v3-turbo
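For context, the traceback above is raised by the Prisma Python client that the LiteLLM proxy uses for its database features: a raw query is issued during startup (check_view_exists) while the client has no working connection to the Prisma query engine, so the engine layer raises NotConnectedError and startup aborts. The sketch below is a minimal, standalone illustration of that connect-before-query requirement in prisma-client-py, not the proxy's actual startup code; it assumes a generated client whose schema reads env("DATABASE_URL"), and the connection string shown is a placeholder.

# Minimal sketch, assuming standalone prisma-client-py usage (not LiteLLM's
# startup path): the client must connect to its query engine before any raw
# query, otherwise queries fail with "Not connected to the query engine".
import asyncio
import os

from prisma import Prisma  # requires a client generated with `prisma generate`


async def main() -> None:
    # Placeholder connection string; in a real deployment DATABASE_URL comes
    # from the environment and must point at a reachable database.
    os.environ.setdefault(
        "DATABASE_URL", "postgresql://user:password@localhost:5432/litellm"
    )

    db = Prisma()
    # connect() starts and attaches to the query engine; if this step never
    # succeeds, later calls surface as NotConnectedError, as in the log above.
    await db.connect()
    rows = await db.query_raw("SELECT 1")
    print(rows)
    await db.disconnect()


if __name__ == "__main__":
    asyncio.run(main())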
