---
license: cc-by-nc-nd-4.0
language:
- en
- ko
datasets:
- bi-matrix/nl2json
---

# Model Card for bi-matrix/g-matrix-mx7b

### Introduction to BI MATRIX

https://www.bimatrix.co.kr/

We are a group of experts with extensive knowledge and experience in Big Data Analytics, Visualization, Mobile Analytics, and Data Mining.

### Model Summary

Our model leverages the capabilities of the large language model Mistral-7B for natural language processing. Fine-tuned from "mistralai/Mistral-7B-Instruct-v0.2", it is specifically optimized to convert natural language into the request JSON format for the "i-META" solution.
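
For lower-level control than the pipeline example in the next section, the fine-tuned checkpoint can also be loaded directly with the standard `transformers` classes. The snippet below is a minimal sketch under that assumption, not part of the official usage; it reuses the prompt template shown in "How to Use".

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal sketch: load the checkpoint directly instead of using pipeline().
model_id = "bi-matrix/g-matrix-mx7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Same prompt template as in "How to Use":
# system: "Please turn the question into JSON."
# user:   "Tell me this month's sales results by product line for our department
#          and the achievement rate against the execution plan."
prompt = (
    "<|system|>\n질문을 json 으로 만들어주세요.\n\n"
    "<|user|>\n이번달 우리부서의 제품라인별 매출 실적과 실행대비 달성율을 알려줘\n\n"
    "<|assistant|>\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_k=3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```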

### How to Use

Here are some examples of how to use our model.

```python
import torch
from transformers import pipeline

model = 'bi-matrix/g-matrix-mx7b'

# Prompt: a system instruction plus the user question, ending with the assistant tag.
# System (Korean): "Please turn the question into JSON."
# User (Korean): "Tell me this month's sales results by product line for our
#                 department and the achievement rate against the execution plan."
message = """
<|system|>
질문을 json 으로 만들어주세요.

<|user|>
이번달 우리부서의 제품라인별 매출 실적과 실행대비 달성율을 알려줘

<|assistant|>
"""

# Text-generation pipeline in half precision, spread across available devices.
pipe = pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto"
)

# Generate one completion with light sampling.
sequences = pipe(
    message,
    do_sample=True,
    top_k=3,
    num_return_sequences=1,
    max_length=2048,
)

for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
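
Continuing the example above, the pipeline returns the prompt followed by the model's completion. The snippet below is a minimal post-processing sketch that assumes the completion after the `<|assistant|>` tag is a single JSON object (the exact i-META request schema is not documented here); it is an illustration, not part of the official usage.

```python
import json

# Continues the example above: `sequences` and `message` are defined there.
# Assumption: the model emits one JSON object after the <|assistant|> tag.
generated = sequences[0]["generated_text"]
completion = generated[len(message):].strip()  # drop the echoed prompt

try:
    request_json = json.loads(completion)
    print(json.dumps(request_json, ensure_ascii=False, indent=2))
except json.JSONDecodeError:
    print("Output was not valid JSON:", completion)
```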

### Warnings

Our model, "bi-matrix/g-matrix-mx7b", was fine-tuned specifically to convert natural language into the request JSON format for the "i-META" solution.
Consequently, it may not be suitable for general-purpose use.

### Contact

If you have any questions, please contact us at [email protected].