runtime error

Exit code: 1. Reason: Model 'codellama/CodeLlama-7b-Instruct-hf' is not supported for this environment. Supported models: ['ibm/granite-13b-instruct-v2', 'ibm/granite-20b-multilingual', 'ibm/granite-3-2b-instruct', 'ibm/granite-3-8b-instruct', 'ibm/granite-8b-code-instruct', 'ibm/granite-guardian-3-2b', 'ibm/granite-guardian-3-8b', 'meta-llama/llama-3-1-70b-instruct', 'meta-llama/llama-3-2-11b-vision-instruct', 'meta-llama/llama-3-2-90b-vision-instruct', 'meta-llama/llama-3-70b-instruct', 'meta-llama/llama-guard-3-11b-vision', 'mistralai/mistral-large']

Traceback (most recent call last):
  File "/home/user/app/app.py", line 28, in <module>
    model = ModelInference(
  File "/usr/local/lib/python3.10/site-packages/ibm_watsonx_ai/foundation_models/inference/model_inference.py", line 201, in __init__
    self._inference = FMModelInference(
  File "/usr/local/lib/python3.10/site-packages/ibm_watsonx_ai/foundation_models/inference/fm_model_inference.py", line 93, in __init__
    raise WMLClientError(
ibm_watsonx_ai.wml_client_error.WMLClientError: Model 'codellama/CodeLlama-7b-Instruct-hf' is not supported for this environment. Supported models: ['ibm/granite-13b-instruct-v2', 'ibm/granite-20b-multilingual', 'ibm/granite-3-2b-instruct', 'ibm/granite-3-8b-instruct', 'ibm/granite-8b-code-instruct', 'ibm/granite-guardian-3-2b', 'ibm/granite-guardian-3-8b', 'meta-llama/llama-3-1-70b-instruct', 'meta-llama/llama-3-2-11b-vision-instruct', 'meta-llama/llama-3-2-90b-vision-instruct', 'meta-llama/llama-3-70b-instruct', 'meta-llama/llama-guard-3-11b-vision', 'mistralai/mistral-large']
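The traceback shows `ModelInference` raising `WMLClientError` because the requested model id is not among the models this watsonx.ai environment serves. One way to fail faster with a clearer message is to validate the model id against the supported list before constructing `ModelInference`. A minimal sketch, assuming the supported list from the error above; `validate_model_id` is a hypothetical helper, not part of the `ibm_watsonx_ai` API:

```python
# Model ids reported as supported in the error message above.
SUPPORTED_MODELS = [
    "ibm/granite-13b-instruct-v2",
    "ibm/granite-20b-multilingual",
    "ibm/granite-3-2b-instruct",
    "ibm/granite-3-8b-instruct",
    "ibm/granite-8b-code-instruct",
    "ibm/granite-guardian-3-2b",
    "ibm/granite-guardian-3-8b",
    "meta-llama/llama-3-1-70b-instruct",
    "meta-llama/llama-3-2-11b-vision-instruct",
    "meta-llama/llama-3-2-90b-vision-instruct",
    "meta-llama/llama-3-70b-instruct",
    "meta-llama/llama-guard-3-11b-vision",
    "mistralai/mistral-large",
]

def validate_model_id(model_id: str) -> str:
    """Raise early, before any API call, if the model is not served here."""
    if model_id not in SUPPORTED_MODELS:
        raise ValueError(
            f"Model {model_id!r} is not supported for this environment. "
            f"Choose one of: {SUPPORTED_MODELS}"
        )
    return model_id

# 'codellama/CodeLlama-7b-Instruct-hf' fails this check; a supported code
# model such as 'ibm/granite-8b-code-instruct' passes.
```

Swapping the unsupported `codellama/CodeLlama-7b-Instruct-hf` for a supported code-oriented model such as `ibm/granite-8b-code-instruct` in `app.py` line 28 should resolve the crash, assuming the rest of the `ModelInference` configuration is valid.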
