<!--Copyright 2020 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the

⚠️ Note that this file is in Markdown but contain specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->

# How to create a custom pipeline? [[how-to-create-a-custom-pipeline]]

In this guide, we will see how to create a custom pipeline and share it on the [Hub](hf.co/models) or add it to the 🤗 Transformers library.

First, you need to decide the raw entries the pipeline will be able to take.
It can be strings, raw bytes, dictionaries, or whatever seems to be the most likely desired input.
Try to keep these inputs as pure Python as possible, as this makes compatibility easier (even with other languages, via JSON).
Those will be the `inputs` of the pipeline (`preprocess`).

그런 λ‹€μŒ `outputs`λ₯Ό μ •μ˜ν•˜μ„Έμš”.
`inputs`와 같은 정책을 λ”°λ₯΄κ³ , κ°„λ‹¨ν• μˆ˜λ‘ μ’‹μŠ΅λ‹ˆλ‹€.
이것이 ν›„μ²˜λ¦¬(`postprocess`) λ©”μ†Œλ“œμ˜ 좜λ ₯이 될 κ²ƒμž…λ‹ˆλ‹€.

Start by inheriting the base class `Pipeline` and implementing the four methods needed: `preprocess`, `_forward`, `postprocess`, and `_sanitize_parameters`.


```python
import torch

from transformers import Pipeline


class MyPipeline(Pipeline):
    def _sanitize_parameters(self, **kwargs):
        preprocess_kwargs = {}
        if "maybe_arg" in kwargs:
            preprocess_kwargs["maybe_arg"] = kwargs["maybe_arg"]
        return preprocess_kwargs, {}, {}

    def preprocess(self, inputs, maybe_arg=2):
        model_input = torch.tensor(inputs["input_ids"])
        return {"model_input": model_input}

    def _forward(self, model_inputs):
        # model_inputs == {"model_input": model_input}
        outputs = self.model(**model_inputs)
        # Maybe {"logits": Tensor(...)}
        return outputs

    def postprocess(self, model_outputs):
        best_class = model_outputs["logits"].softmax(-1)
        return best_class
```

The purpose of this breakdown is to provide relatively seamless support for CPU/GPU, while also making it possible to run pre- and post-processing on the CPU on different threads.

`preprocess` will take the originally defined inputs and turn them into something that can be fed to the model.
It might contain more information and is usually a `Dict`.

`_forward` is an implementation detail and is not meant to be called directly.
`forward` is the preferred calling method, as it contains safeguards to make sure everything works on the expected device.
Anything linked to the actual model belongs in the `_forward` method; everything else goes in the preprocess/postprocess steps.

The `postprocess` method takes the output of `_forward` and turns it into the final output format that was decided earlier.

`_sanitize_parameters` exists to allow users to pass any parameters whenever they wish, be it at initialization time (`pipeline(...., maybe_arg=4)`) or at call time (`pipe = pipeline(...); output = pipe(...., maybe_arg=4)`).

The returns of `_sanitize_parameters` are the three dicts of kwargs that will be passed directly to `preprocess`, `_forward`, and `postprocess`.
Don't fill anything if the caller didn't call with any extra parameter.
That allows keeping the default arguments in the function definition, which is always more "natural".

A classic example would be a `top_k` argument in the post-processing of classification tasks.

```python
>>> pipe = pipeline("my-new-task")
>>> pipe("This is a test")
[{"label": "1-star", "score": 0.8}, {"label": "2-star", "score": 0.1}, {"label": "3-star", "score": 0.05}
{"label": "4-star", "score": 0.025}, {"label": "5-star", "score": 0.025}]

>>> pipe("This is a test", top_k=2)
[{"label": "1-star", "score": 0.8}, {"label": "2-star", "score": 0.1}]
```

In order to achieve that, we'll update our `postprocess` method with a default parameter of `5` and edit `_sanitize_parameters` to allow this new parameter.


```python
def postprocess(self, model_outputs, top_k=5):
    best_class = model_outputs["logits"].softmax(-1)
    # Add logic to handle top_k
    return best_class


def _sanitize_parameters(self, **kwargs):
    preprocess_kwargs = {}
    if "maybe_arg" in kwargs:
        preprocess_kwargs["maybe_arg"] = kwargs["maybe_arg"]

    postprocess_kwargs = {}
    if "top_k" in kwargs:
        postprocess_kwargs["top_k"] = kwargs["top_k"]
    return preprocess_kwargs, {}, postprocess_kwargs
```
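The comment above leaves the `top_k` handling open. A minimal sketch of one way to fill it in is shown below, assuming a PyTorch classification model whose config provides `id2label`; this is an illustrative implementation, not the exact logic of the built-in text-classification pipeline:

```python
def postprocess(self, model_outputs, top_k=5):
    # Turn the logits into probabilities and keep only the top_k classes.
    probabilities = model_outputs["logits"].softmax(-1)[0]
    top_k = min(top_k, probabilities.shape[-1])  # guard against an oversized top_k
    scores, ids = probabilities.topk(top_k)
    return [
        {"label": self.model.config.id2label[_id.item()], "score": score.item()}
        for score, _id in zip(scores, ids)
    ]
```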

Try to keep the inputs/outputs as simple as possible and ideally fully JSON-serializable, as this makes the pipeline easy to use without requiring users to understand new kinds of objects.
It's also relatively common to support many different types of arguments for ease of use (for example, audio files can be filenames, URLs, or raw bytes), as sketched below.
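For instance, a minimal sketch of such input normalization could look like the following. `to_raw_bytes` is an illustrative helper (not part of the `Pipeline` API) that a `preprocess` method could call so that a filename, a URL, or raw bytes are all accepted:

```python
import requests


def to_raw_bytes(inputs):
    """Normalize a filename, a URL, or raw bytes into raw bytes."""
    if isinstance(inputs, str):
        if inputs.startswith(("http://", "https://")):
            # Download the content pointed to by the URL.
            return requests.get(inputs).content
        # Otherwise treat the string as a local file path.
        with open(inputs, "rb") as f:
            return f.read()
    # Already raw bytes: pass through unchanged.
    return inputs
```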



## Adding it to the list of supported tasks [[adding-it-to-the-list-of-supported-tasks]]

To register your `new-task` in the list of supported tasks, you have to add it to the `PIPELINE_REGISTRY`:

```python
from transformers import AutoModelForSequenceClassification
from transformers.pipelines import PIPELINE_REGISTRY

PIPELINE_REGISTRY.register_pipeline(
    "new-task",
    pipeline_class=MyPipeline,
    pt_model=AutoModelForSequenceClassification,
)
```

You can specify a default model if you want, in which case it should come with a specific revision (which can be a branch name or a commit hash, here `"abcdef"`) as well as the type:

```python
PIPELINE_REGISTRY.register_pipeline(
    "new-task",
    pipeline_class=MyPipeline,
    pt_model=AutoModelForSequenceClassification,
    default={"pt": ("user/awesome_model", "abcdef")},
    type="text",  # ν˜„μž¬ 지원 μœ ν˜•: text, audio, image, multimodal
)
```
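Once registered with a default, the task can be instantiated like any built-in one. The snippet below is a sketch using the placeholder names from the example above (`"new-task"` and `"user/awesome_model"` are not real Hub entries):

```python
from transformers import pipeline

# With no model specified, the registered default ("user/awesome_model" at revision "abcdef") is loaded.
my_pipe = pipeline("new-task")

# An explicit model (and optional revision) still overrides the default.
my_pipe = pipeline("new-task", model="user/awesome_model", revision="abcdef")
```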

## Share your pipeline on the Hub [[share-your-pipeline-on-the-hub]]

To share your custom pipeline on the Hub, all you need to do is save the custom code of your `Pipeline` subclass in a Python file.
For instance, let's say we want to use a custom pipeline for sentence pair classification like this:

```py
import numpy as np

from transformers import Pipeline


def softmax(outputs):
    maxes = np.max(outputs, axis=-1, keepdims=True)
    shifted_exp = np.exp(outputs - maxes)
    return shifted_exp / shifted_exp.sum(axis=-1, keepdims=True)


class PairClassificationPipeline(Pipeline):
    def _sanitize_parameters(self, **kwargs):
        preprocess_kwargs = {}
        if "second_text" in kwargs:
            preprocess_kwargs["second_text"] = kwargs["second_text"]
        return preprocess_kwargs, {}, {}

    def preprocess(self, text, second_text=None):
        return self.tokenizer(text, text_pair=second_text, return_tensors=self.framework)

    def _forward(self, model_inputs):
        return self.model(**model_inputs)

    def postprocess(self, model_outputs):
        logits = model_outputs.logits[0].numpy()
        probabilities = softmax(logits)

        best_class = np.argmax(probabilities)
        label = self.model.config.id2label[best_class]
        score = probabilities[best_class].item()
        logits = logits.tolist()
        return {"label": label, "score": score, "logits": logits}
```

The implementation is framework-agnostic and will work for both PyTorch and TensorFlow models.
If we have saved this in a file named `pair_classification.py`, we can then import it and register it like this:

```py
from pair_classification import PairClassificationPipeline
from transformers.pipelines import PIPELINE_REGISTRY
from transformers import AutoModelForSequenceClassification, TFAutoModelForSequenceClassification

PIPELINE_REGISTRY.register_pipeline(
    "pair-classification",
    pipeline_class=PairClassificationPipeline,
    pt_model=AutoModelForSequenceClassification,
    tf_model=TFAutoModelForSequenceClassification,
)
```

Once this is done, we can use it with a pretrained model.
For instance, `sgugger/finetuned-bert-mrpc` has been fine-tuned on the MRPC dataset, which classifies pairs of sentences as paraphrases or not.

```py
from transformers import pipeline

classifier = pipeline("pair-classification", model="sgugger/finetuned-bert-mrpc")
```
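The pipeline can then be called on a sentence pair, with the second sentence passed through the `second_text` keyword that `_sanitize_parameters` forwards to `preprocess` (the sentences below are arbitrary examples, and the exact scores depend on the model):

```py
result = classifier("Using transformers is easy!", second_text="Transformers is quite easy to use.")
# result is a dict of the form {"label": ..., "score": ..., "logits": [...]}
```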

그런 λ‹€μŒ `Repository`의 `save_pretrained` λ©”μ†Œλ“œλ₯Ό μ‚¬μš©ν•˜μ—¬ ν—ˆλΈŒμ— κ³΅μœ ν•  수 μžˆμŠ΅λ‹ˆλ‹€:

```py
from huggingface_hub import Repository

repo = Repository("test-dynamic-pipeline", clone_from="{your_username}/test-dynamic-pipeline")
classifier.save_pretrained("test-dynamic-pipeline")
repo.push_to_hub()
```

μ΄λ ‡κ²Œ ν•˜λ©΄ "test-dynamic-pipeline" 폴더 내에 `PairClassificationPipeline`을 μ •μ˜ν•œ 파일이 λ³΅μ‚¬λ˜λ©°, νŒŒμ΄ν”„λΌμΈμ˜ λͺ¨λΈκ³Ό ν† ν¬λ‚˜μ΄μ €λ„ μ €μž₯ν•œ ν›„, `{your_username}/test-dynamic-pipeline` μ €μž₯μ†Œμ— μžˆλŠ” λͺ¨λ“  것을 ν‘Έμ‹œν•©λ‹ˆλ‹€.
μ΄ν›„μ—λŠ” `trust_remote_code=True` μ˜΅μ…˜λ§Œ μ œκ³΅ν•˜λ©΄ λˆ„κ΅¬λ‚˜ μ‚¬μš©ν•  수 μžˆμŠ΅λ‹ˆλ‹€.

```py
from transformers import pipeline

classifier = pipeline(model="{your_username}/test-dynamic-pipeline", trust_remote_code=True)
```

## Add the pipeline to 🤗 Transformers [[add-the-pipeline-to-transformers]]

If you want to contribute your pipeline to 🤗 Transformers, you will need to add a new module in the `pipelines` submodule with the code of your pipeline, then add it to the list of tasks defined in `pipelines/__init__.py`.

그런 λ‹€μŒ ν…ŒμŠ€νŠΈλ₯Ό μΆ”κ°€ν•΄μ•Ό ν•©λ‹ˆλ‹€.
`tests/test_pipelines_MY_PIPELINE.py`λΌλŠ” μƒˆ νŒŒμΌμ„ λ§Œλ“€κ³  λ‹€λ₯Έ ν…ŒμŠ€νŠΈμ™€ 예제λ₯Ό ν•¨κ»˜ μž‘μ„±ν•©λ‹ˆλ‹€.

The `run_pipeline_test` function will be very generic and run on small random models for every possible architecture defined by `model_mapping` and `tf_model_mapping`.

μ΄λŠ” ν–₯ν›„ ν˜Έν™˜μ„±μ„ ν…ŒμŠ€νŠΈν•˜λŠ” 데 맀우 μ€‘μš”ν•˜λ©°, λˆ„κ΅°κ°€ `XXXForQuestionAnswering`을 μœ„ν•œ μƒˆ λͺ¨λΈμ„ μΆ”κ°€ν•˜λ©΄ νŒŒμ΄ν”„λΌμΈ ν…ŒμŠ€νŠΈκ°€ ν•΄λ‹Ή λͺ¨λΈμ—μ„œ 싀행을 μ‹œλ„ν•œλ‹€λŠ” μ˜λ―Έμž…λ‹ˆλ‹€.
λͺ¨λΈμ΄ λ¬΄μž‘μœ„μ΄κΈ° λ•Œλ¬Έμ— μ‹€μ œ 값을 ν™•μΈν•˜λŠ” 것은 λΆˆκ°€λŠ₯ν•˜λ―€λ‘œ, λ‹¨μˆœνžˆ νŒŒμ΄ν”„λΌμΈ 좜λ ₯ `TYPE`κ³Ό μΌμΉ˜μ‹œν‚€κΈ° μœ„ν•œ λ„μš°λ―Έ `ANY`κ°€ μžˆμŠ΅λ‹ˆλ‹€.
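As a rough sketch of how such a helper works (the actual `ANY` used in the test suite may differ in its details), it only compares types, so structural assertions still pass even though a random model produces meaningless values:

```python
class ANY:
    # Matches any value of the given type(s), so tests can assert on the
    # *structure* of a pipeline output without pinning exact values.
    def __init__(self, *types):
        self._types = types

    def __eq__(self, other):
        return isinstance(other, self._types)

    def __repr__(self):
        return f"ANY({', '.join(t.__name__ for t in self._types)})"


# Example: only the types of "label" and "score" are checked, not their values.
outputs = [{"label": "LABEL_0", "score": 0.504}]
assert outputs == [{"label": ANY(str), "score": ANY(float)}]
```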

You also need to implement 2 (ideally 4) tests (a sketch of the first one is shown after the list below).

- `test_small_model_pt`: Define one small model for this pipeline (it doesn't matter if the results don't make sense) and test the pipeline outputs.
  The results should be the same as those of `test_small_model_tf`.
- `test_small_model_tf`: Define one small model for this pipeline (it doesn't matter if the results don't make sense) and test the pipeline outputs.
  The results should be the same as those of `test_small_model_pt`.
- `test_large_model_pt` (`optional`): Tests the pipeline on a real pipeline where the results are supposed to make sense.
  These tests are slow and should be marked as such.
  Here the goal is to showcase the pipeline and make sure there is no drift in future releases.
- `test_large_model_tf` (`optional`): Tests the pipeline on a real pipeline where the results are supposed to make sense.
  These tests are slow and should be marked as such.
  Here the goal is to showcase the pipeline and make sure there is no drift in future releases.
μ—¬κΈ°μ„œμ˜ λͺ©ν‘œλŠ” νŒŒμ΄ν”„λΌμΈμ„ 보여주고 ν–₯ν›„ λ¦΄λ¦¬μ¦ˆμ—μ„œμ˜ λ³€ν™”κ°€ μ—†λŠ”μ§€ ν™•μΈν•˜λŠ” κ²ƒμž…λ‹ˆλ‹€.