const fetch = require('node-fetch').default;
const { setAdditionalHeadersByType } = require('../additional-headers');
const { TEXTGEN_TYPES } = require('../constants');

/**
 * Gets the vectors for the given texts from Ollama
 * @param {string[]} texts - The array of texts to get the vectors for
 * @param {string} apiUrl - The API URL
 * @param {string} model - The model to use
 * @param {boolean} keep - Keep the model loaded in memory
 * @param {import('../users').UserDirectoryList} directories - The directories object for the user
 * @returns {Promise<number[][]>} - The array of vectors for the texts
 */
async function getOllamaBatchVector(texts, apiUrl, model, keep, directories) {
    const result = [];
    // Ollama's /api/embeddings endpoint accepts one prompt per request, so texts are embedded sequentially.
    for (const text of texts) {
        const vector = await getOllamaVector(text, apiUrl, model, keep, directories);
        result.push(vector);
    }
    return result;
}

/**
 * Gets the vector for the given text from Ollama
 * @param {string} text - The text to get the vector for
 * @param {string} apiUrl - The API URL
 * @param {string} model - The model to use
 * @param {boolean} keep - Keep the model loaded in memory
 * @param {import('../users').UserDirectoryList} directories - The directories object for the user
 * @returns {Promise<number[]>} - The vector for the text
 */
async function getOllamaVector(text, apiUrl, model, keep, directories) {
    const url = new URL(apiUrl);
    url.pathname = '/api/embeddings';

    const headers = {};
    setAdditionalHeadersByType(headers, TEXTGEN_TYPES.OLLAMA, apiUrl, directories);

    const response = await fetch(url, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            ...headers,
        },
        body: JSON.stringify({
            prompt: text,
            model: model,
            // A keep_alive of -1 keeps the model loaded in memory indefinitely; omitting it uses Ollama's default.
            keep_alive: keep ? -1 : undefined,
        }),
    });

    if (!response.ok) {
        const responseText = await response.text();
        throw new Error(`Ollama: Failed to get vector for text: ${response.statusText} ${responseText}`);
    }

    const data = await response.json();

    if (!Array.isArray(data?.embedding)) {
        throw new Error('Ollama: API response did not contain an embedding array');
    }

    return data.embedding;
}

module.exports = {
    getOllamaBatchVector,
    getOllamaVector,
};
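
// Usage sketch (hypothetical caller; the require path, URL, model name, and `directories`
// value below are illustrative assumptions, not part of this module):
//
//   const { getOllamaVector } = require('./ollama-vectors');
//   const vector = await getOllamaVector('hello world', 'http://localhost:11434', 'nomic-embed-text', false, directories);
//   // vector is a number[] whose length is the embedding dimensionality of the chosen model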