# API Documentation for `Lenylvt/Translator-API`

This documentation explains how to interact with the Translator API using both Python and JavaScript.

## API Endpoint

To interact with this API, you can use either the `gradio_client` Python library or the `@gradio/client` JavaScript package.

## Python Usage

### Step 1: Installation

First, install the `gradio_client` library if it's not already installed.

```bash
pip install gradio_client
```

### Step 2: Making a Request

Locate the API endpoint for the function you intend to use. Replace the placeholder values in the snippet below with your actual input data. If accessing a private Space, you may need to include your Hugging Face token.

**API Name**: `/predict`

```python
from gradio_client import Client

client = Client("Lenylvt/Translator-API")
result = client.predict(
    "Hello!!",  # str in 'text' Textbox component
    "en",       # Source Language (ISO 639-1 code, e.g., 'en' for English) in 'Source Language' Dropdown component
    "es",       # Target Language (ISO 639-1 code, e.g., 'es' for Spanish) in 'Target Language' Dropdown component
    api_name="/predict"
)
print(result)
```
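
If the Space you are calling is private, the Hugging Face token mentioned above can be passed directly to the `Client` constructor. A minimal sketch, assuming you have a valid access token (the value below is a placeholder):

```python
from gradio_client import Client

# Placeholder token -- replace with your own Hugging Face access token.
client = Client("Lenylvt/Translator-API", hf_token="hf_xxxxxxxxxxxxxxxx")

result = client.predict(
    "Hello!!",  # text to translate
    "en",       # source language (ISO 639-1)
    "es",       # target language (ISO 639-1)
    api_name="/predict"
)
print(result)
```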

**Return Type(s):**

- A `str` representing the translated text output in the 'output' Textbox component.
  
🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`', **it's because that language pair is not available.**
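
If you want to guard against unsupported language pairs programmatically, a minimal defensive sketch is shown below. Depending on the Space's Gradio version, the failure may surface either as an exception raised by `predict` or as an error string in the returned output, so both cases are checked; the helper name `translate_or_none` is illustrative and not part of the API.

```python
from gradio_client import Client

client = Client("Lenylvt/Translator-API")

def translate_or_none(text, source_lang, target_lang):
    """Return the translated text, or None if the language pair is unavailable."""
    try:
        result = client.predict(text, source_lang, target_lang, api_name="/predict")
    except Exception as exc:  # the Space may raise if the model cannot be loaded
        print(f"Translation failed: {exc}")
        return None
    # Some versions return the error message as the output string instead of raising.
    if isinstance(result, str) and result.startswith("Failed to load model"):
        print(f"Translation failed: {result}")
        return None
    return result

print(translate_or_none("Hello!!", "en", "es"))  # expected: a Spanish translation
print(translate_or_none("Hello!!", "aa", "ab"))  # unsupported pair -> None
```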

## JavaScript Usage

### Step 1: Installation

Install the `@gradio/client` package if it's not already in your project.

```bash
npm i -D @gradio/client
```

### Step 2: Making a Request

As with Python, identify the API endpoint that matches your needs. Replace the placeholders with your own data. If this is a private Space, don't forget to include your Hugging Face token.

**API Name**: `/predict`

```javascript
import { client } from "@gradio/client";

const app = await client("Lenylvt/Translator-API");
const result = await app.predict("/predict", [        
    "Hello!!", // string in 'text' Textbox component        
    "en",      // string representing ISO 639-1 code for Source Language in 'Source Language' Dropdown component        
    "es",      // string representing ISO 639-1 code for Target Language in 'Target Language' Dropdown component
]);

console.log(result.data);
```

**Return Type(s):**

- A `string` representing the translated text output in the 'output' Textbox component.
  
🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`', **it's because that language pair is not available.**