|
--- |
|
title: Petals |
|
--- |
|
|
|
To use Open Interpreter with a model from Petals, set the `model` flag to begin with `petals/`:
|
|
|
<CodeGroup> |
|
|
|
```bash Terminal |
|
interpreter --model petals/petals-team/StableBeluga2
|
``` |
|
|
|
```python Python |
|
from interpreter import interpreter |
|
|
|
interpreter.llm.model = "petals/petals-team/StableBeluga2"
|
interpreter.chat() |
|
``` |
|
|
|
</CodeGroup>
|
|
|
# Pre-Requisites |
|
|
|
Ensure you have `petals` installed:
|
|
|
```bash |
|
pip install git+https://github.com/bigscience-workshop/petals
|
``` |
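To confirm the installation can reach the public swarm, you can run a short generation directly against Petals before involving Open Interpreter. This is a minimal sketch adapted from the Petals README; the model name and prompt are just examples:

```python
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Any Petals-hosted model works here; StableBeluga2 is used as an example
model_name = "petals-team/StableBeluga2"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Loads a small part of the model locally and runs the rest on the distributed swarm
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A quick test:", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```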
|
|
|
# Supported Models
|
|
|
We support any model on [Petals](https://github.com/bigscience-workshop/petals):
|
|
|
<CodeGroup> |
|
|
|
```bash Terminal

interpreter --model petals/petals-team/StableBeluga2

interpreter --model petals/huggyllama/llama-65b

```
|
|
|
```python Python

interpreter.llm.model = "petals/petals-team/StableBeluga2"

interpreter.llm.model = "petals/huggyllama/llama-65b"

```
|
|
|
</CodeGroup> |
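If you prefer a scripted, non-interactive run, `interpreter.chat()` also accepts a message directly. A short sketch, assuming the same example model as above:

```python
from interpreter import interpreter

# Route Open Interpreter's LLM calls through a Petals-hosted model
interpreter.llm.model = "petals/petals-team/StableBeluga2"

# Passing a message runs a single exchange instead of opening the interactive prompt
interpreter.chat("Print the first 10 prime numbers.")
```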
|
|
|
# Required Environment Variables |
|
|
|
No environment variables are required to use these models. |