---
license: cc-by-nc-nd-4.0
language:
  - en
  - ko
datasets:
  - bi-matrix/nl2json
---

# Model Card for bi-matrix/g-matrix-mx7b

## Introduction to BI MATRIX

https://www.bimatrix.co.kr/

We are a group of experts with extensive knowledge and experience in Big Data Analytics, Visualization, Mobile Analytics, and Data Mining.

## Model Summary

Our model builds on the large language model Mistral-7B for natural language processing. Fine-tuned from "mistralai/Mistral-7B-Instruct-v0.2", it is specifically optimized to convert natural-language questions into the request JSON format used by the "i-META" solution.
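To make the task concrete, here is a minimal sketch of the kind of input/output pair involved. The field names below are hypothetical and for illustration only; the actual request schema is defined by the i-META solution and the bi-matrix/nl2json dataset.

```python
# Illustrative only: hypothetical field names, NOT the actual i-META request schema.
question = (
    "Show me this month's sales results by product line for our department "
    "and the achievement rate against the plan."
)

# The model is expected to turn such a question into a structured JSON request, e.g.:
expected_json = {
    "measures": ["sales_amount", "achievement_rate"],  # hypothetical keys
    "dimensions": ["product_line"],
    "filters": {"period": "current_month", "department": "ours"},
}
```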

## How to Use

Here is an example of running the model with the transformers pipeline:

```python
import torch
from transformers import pipeline

model = 'bi-matrix/g-matrix-mx7b'

# Prompt using the model's chat markers. The Korean text means:
#   system: "Please turn the question into JSON."
#   user:   "Show me this month's sales results by product line for our
#            department and the achievement rate against the plan."
message = """
<|system|>
질문을 json 으로 만들어주세요.

<|user|>
이번달 우리부서의 제품라인별 매출 실적과 실행대비 달성율을 알려줘

<|assistant|>
"""

# Load the model in half precision and spread it across available devices.
pipe = pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Generate the JSON continuation after the <|assistant|> marker.
sequences = pipe(
    message,
    do_sample=True,
    top_k=3,
    num_return_sequences=1,
    max_length=2048,
)

for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
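The pipeline returns the full prompt followed by the generated continuation. A minimal post-processing sketch, assuming the JSON appears after the final `<|assistant|>` marker (the exact output framing may vary), could look like this:

```python
import json

def extract_json(generated_text: str):
    """Parse the JSON that follows the last <|assistant|> marker, if any."""
    # Keep only the assistant's continuation.
    answer = generated_text.split("<|assistant|>")[-1].strip()
    try:
        return json.loads(answer)
    except json.JSONDecodeError:
        # The model may occasionally emit extra text; handle as needed.
        return None

result = extract_json(sequences[0]["generated_text"])
print(result)
```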

## Warnings

Our model, "bi-matrix/g-matrix-mx7b", was fine-tuned specifically for converting natural language into the request JSON format of the "i-META" solution. Consequently, it may not be suitable for general-purpose use.

## Contact

If you have any questions, please contact us at [email protected].