|
--- |
|
title: Perplexity |
|
--- |
|
|
|
To use Open Interpreter with the Perplexity API, set the `model` flag:
|
|
|
<CodeGroup> |
|
|
|
```bash Terminal
interpreter --model perplexity/<perplexity-model>
```
|
|
|
```python Python |
|
from interpreter import interpreter |
|
|
|
interpreter.llm.model = "perplexity/<perplexity-model>"
|
interpreter.chat() |
|
``` |
|
|
|
</CodeGroup>
|
|
|
# Supported Models
|
|
|
We support the following completion models from the Perplexity API:
|
|
|
- pplx-7b-chat |
|
- pplx-70b-chat |
|
- pplx-7b-online |
|
- pplx-70b-online |
|
- codellama-34b-instruct |
|
- llama-2-13b-chat |
|
- llama-2-70b-chat |
|
- mistral-7b-instruct |
|
- openhermes-2-mistral-7b |
|
- openhermes-2.5-mistral-7b |
|
- pplx-7b-chat-alpha |
|
- pplx-70b-chat-alpha |
|
|
|
<CodeGroup> |
|
|
|
```bash Terminal
interpreter --model perplexity/pplx-7b-chat
interpreter --model perplexity/pplx-70b-chat
interpreter --model perplexity/pplx-7b-online
interpreter --model perplexity/pplx-70b-online
interpreter --model perplexity/codellama-34b-instruct
interpreter --model perplexity/llama-2-13b-chat
interpreter --model perplexity/llama-2-70b-chat
interpreter --model perplexity/mistral-7b-instruct
interpreter --model perplexity/openhermes-2-mistral-7b
interpreter --model perplexity/openhermes-2.5-mistral-7b
interpreter --model perplexity/pplx-7b-chat-alpha
interpreter --model perplexity/pplx-70b-chat-alpha
```
|
|
|
```python Python
interpreter.llm.model = "perplexity/pplx-7b-chat"
interpreter.llm.model = "perplexity/pplx-70b-chat"
interpreter.llm.model = "perplexity/pplx-7b-online"
interpreter.llm.model = "perplexity/pplx-70b-online"
interpreter.llm.model = "perplexity/codellama-34b-instruct"
interpreter.llm.model = "perplexity/llama-2-13b-chat"
interpreter.llm.model = "perplexity/llama-2-70b-chat"
interpreter.llm.model = "perplexity/mistral-7b-instruct"
interpreter.llm.model = "perplexity/openhermes-2-mistral-7b"
interpreter.llm.model = "perplexity/openhermes-2.5-mistral-7b"
interpreter.llm.model = "perplexity/pplx-7b-chat-alpha"
interpreter.llm.model = "perplexity/pplx-70b-chat-alpha"
```
|
|
|
</CodeGroup> |
|
|
|
# Required Environment Variables |
|
|
|
Set the following environment variables to use these models.
|
|
|
| Environment Variable | Description | Where to Find | |
|
| ----------------------- | ------------------------------------ | ----------------------------------------------------------------- | |
|
| `PERPLEXITYAI_API_KEY` | The API key for your Perplexity account. | [Perplexity API Settings](https://www.perplexity.ai/settings/api) |
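If you prefer to configure the key from Python rather than your shell profile, a minimal sketch is to set it on `os.environ` before your first request (the key value below is a placeholder, not a real key):

```python
import os

# Export the Perplexity API key for the current process so the
# underlying LLM client can read it. Placeholder value shown.
os.environ["PERPLEXITYAI_API_KEY"] = "pplx-your-key-here"
```

In a shell, the equivalent is `export PERPLEXITYAI_API_KEY=<your-key>` before launching `interpreter`.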
|
|