# Helicone

This page covers how to use [Helicone](https://helicone.ai) within LangChain.

## What is Helicone?

Helicone is an [open source](https://github.com/Helicone/helicone) observability platform that proxies your OpenAI traffic and provides key insights into your spend, latency, and usage.

![Helicone](../_static/HeliconeDashboard.png)

## Quick start

With your LangChain environment, you only need to add the following environment variable.

```bash
export OPENAI_API_BASE="https://oai.hconeai.com/v1"
```

Now head over to [helicone.ai](https://helicone.ai/onboarding?step=2) to create your account, and add your OpenAI API key within the Helicone dashboard to view your logs.

![Helicone](../_static/HeliconeKeys.png)
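
As a minimal sketch (assuming `OPENAI_API_KEY` is also set in your environment), a standard LangChain call will now be routed through the Helicone proxy and appear in your dashboard without any code changes:

```python
from langchain.llms import OpenAI

# Because OPENAI_API_BASE points at the Helicone proxy, this request is
# forwarded to OpenAI and logged in your Helicone dashboard automatically.
llm = OpenAI(temperature=0.9)
text = "What is a helicone?"
print(llm(text))
```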

## How to enable Helicone caching

```python
from langchain.llms import OpenAI
import openai

# Route OpenAI requests through the Helicone proxy
openai.api_base = "https://oai.hconeai.com/v1"

# The Helicone-Cache-Enabled header turns on response caching for these requests
llm = OpenAI(temperature=0.9, headers={"Helicone-Cache-Enabled": "true"})
text = "What is a helicone?"
print(llm(text))
```

[Helicone caching docs](https://docs.helicone.ai/advanced-usage/caching)

## How to use Helicone custom properties

```python
from langchain.llms import OpenAI
import openai

# Route OpenAI requests through the Helicone proxy
openai.api_base = "https://oai.hconeai.com/v1"

# Custom properties are sent as headers and appear as filters in the Helicone dashboard
llm = OpenAI(temperature=0.9, headers={
        "Helicone-Property-Session": "24",
        "Helicone-Property-Conversation": "support_issue_2",
        "Helicone-Property-App": "mobile",
      })
text = "What is a helicone?"
print(llm(text))
```

[Helicone property docs](https://docs.helicone.ai/advanced-usage/custom-properties)