---
title: Google Gemini/Vertex
description: OpenHands uses LiteLLM to make calls to Google's chat models. You can find their documentation on using Google as a provider -> [Gemini - Google AI Studio](https://docs.litellm.ai/docs/providers/gemini), [VertexAI - Google Cloud Platform](https://docs.litellm.ai/docs/providers/vertex)
---
## Gemini - Google AI Studio Configs

When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `Gemini`
- `LLM Model` to the model you will be using.
  If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model`
  (e.g. `gemini/<model-name>` like `gemini/gemini-2.0-flash`).
- `API Key` to your Gemini API key
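
If you run OpenHands headless or scripted rather than through the UI, the same values can be passed as environment variables on the [docker run command](../installation#running-openhands). The sketch below assumes the `LLM_MODEL` and `LLM_API_KEY` variables are honored and uses an illustrative image tag; check the installation docs for the exact names and the full command:

```
# Sketch only: LLM_MODEL / LLM_API_KEY as env vars are an assumption;
# the UI settings above are the documented path. Keep the other flags
# from the installation docs and pin a real image tag.
docker run -it --rm \
    -e LLM_MODEL="gemini/gemini-2.0-flash" \
    -e LLM_API_KEY="<your-gemini-api-key>" \
    docker.all-hands.dev/all-hands-ai/openhands:latest
```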
## VertexAI - Google Cloud Platform Configs

To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment
variables using `-e` in the [docker run command](../installation#running-openhands):

```
GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>" | |
VERTEXAI_PROJECT="<your-gcp-project-id>" | |
VERTEXAI_LOCATION="<your-gcp-location>" | |
```
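
For example, passing these with `-e` might look like the abbreviated sketch below. The service-account path and image tag are placeholders, and the remaining flags from the [installation guide](../installation#running-openhands) are omitted for brevity:

```
# Abbreviated sketch: keep the remaining flags from the installation docs.
docker run -it --rm \
    -e GOOGLE_APPLICATION_CREDENTIALS="$(cat /path/to/service-account.json)" \
    -e VERTEXAI_PROJECT="<your-gcp-project-id>" \
    -e VERTEXAI_LOCATION="<your-gcp-location>" \
    docker.all-hands.dev/all-hands-ai/openhands:latest
```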

Then set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `VertexAI`
- `LLM Model` to the model you will be using.
  If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model`
  (e.g. `vertex_ai/<model-name>`).