# Cloudflare

| Feature                     | Available |
| --------------------------- | --------- |
| [Tools](../tools)           | No        |
| [Multimodal](../multimodal) | No        |

You may use Cloudflare Workers AI to run your own models with serverless inference.

You will need a Cloudflare account, your [account ID](https://developers.cloudflare.com/fundamentals/setup/find-account-and-zone-ids/), and an [API token](https://developers.cloudflare.com/workers-ai/get-started/rest-api/#1-get-an-api-token) for Workers AI.

You can either set them in `.env.local` as the `CLOUDFLARE_ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN` variables, or specify them directly in the endpoint config.
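
For example, assuming placeholder values that you replace with your own credentials, `.env.local` could contain:

```ini
# Replace with your Cloudflare account ID and Workers AI API token
CLOUDFLARE_ACCOUNT_ID=your-account-id
CLOUDFLARE_API_TOKEN=your-api-token
```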

You can find the list of models available on Cloudflare [here](https://developers.cloudflare.com/workers-ai/models/#text-generation).

```ini
MODELS=`[
  {
    "name" : "nousresearch/hermes-2-pro-mistral-7b",
    "tokenizer": "nousresearch/hermes-2-pro-mistral-7b",
    "parameters": {
      "stop": ["<|im_end|>"]
    },
    "endpoints" : [
      {
        "type" : "cloudflare"
        <!-- optionally specify these
        "accountId": "your-account-id",
        "authToken": "your-api-token"
        -->
      }
    ]
  }
]`
```
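
If `CLOUDFLARE_ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN` are set in `.env.local`, the `accountId` and `authToken` fields are optional and can be omitted from the endpoint config.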