Merge pull request #16 from ping98k/codex/add-info-to-base-api-for-litellm
README.md
CHANGED
@@ -29,6 +29,12 @@ This project provides a small interface for running "tournaments" between langua
 `chat_template_kwargs={"enable_thinking": True}` with each
 `litellm.completion` call for that model. Otherwise it sends
 `chat_template_kwargs={"enable_thinking": False}`.
+
+   The app uses [LiteLLM](https://github.com/BerriAI/litellm) to talk to
+   language models. If you leave `OPENAI_API_BASE` blank, LiteLLM defaults to
+   `https://api.openai.com/v1`. When the "API Token" field in the interface is
+   empty, the value from `OPENAI_API_KEY` will be used. These defaults let you
+   quickly connect to OpenAI without extra configuration.
 2. Install dependencies (example with `pip`):
    ```bash
    pip install gradio litellm python-dotenv tqdm matplotlib
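The fallback behaviour the added README lines describe can be sketched in Python. Note this is a minimal illustration, not code from the repository: `resolve_credentials` is a hypothetical helper name, and it assumes that passing `api_base=None` simply lets LiteLLM fall back to its default of `https://api.openai.com/v1`.

```python
import os


def resolve_credentials(token_field: str = "", base_field: str = ""):
    """Hypothetical helper mirroring the precedence described in the README.

    An empty "API Token" field falls back to the OPENAI_API_KEY environment
    variable; an empty base URL falls back to OPENAI_API_BASE, and if that is
    also blank, None is returned so LiteLLM uses its own default endpoint.
    """
    api_key = token_field or os.environ.get("OPENAI_API_KEY", "")
    api_base = base_field or os.environ.get("OPENAI_API_BASE") or None
    return api_key, api_base
```

With these defaults, a user who has only `OPENAI_API_KEY` exported can leave both interface fields blank and still reach the standard OpenAI endpoint.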