---
language:
- en
pipeline_tag: text-generation
license: cc-by-nc-4.0
---
# AISquare-Instruct-yi-ko-6b-v0.9.30
## Model Details
**Developed by**
[Inswave Systems](https://www.inswave.com) UI Platform Team
**Method**
Supervised fine-tuning (SFT) followed by direct preference optimization (DPO)
**Hardware**
Trained on a single node with 4× NVIDIA A100 GPUs
**Base Model**
[beomi/Yi-Ko-6B](https://huggingface.co/beomi/Yi-Ko-6B)
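For reference, the core of the DPO stage is a preference loss computed over chosen/rejected response pairs. The sketch below illustrates that loss in plain PyTorch; it is a minimal, illustrative implementation (the function name, toy log-probabilities, and `beta=0.1` are assumptions), not the team's actual training code.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Direct Preference Optimization loss (Rafailov et al., 2023).

    Each argument is the summed log-probability of a chosen or rejected
    response under the trained policy or the frozen reference model.
    """
    # Implicit rewards: how much more (or less) the policy likes each
    # response than the reference model does, scaled by beta.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Maximize the margin between chosen and rejected rewards.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

# Toy example with made-up log-probabilities: the policy prefers the
# chosen response more than the reference does, so the loss falls
# below the neutral value log(2) ≈ 0.693.
loss = dpo_loss(torch.tensor([-10.0]), torch.tensor([-14.0]),
                torch.tensor([-12.0]), torch.tensor([-13.0]))
print(loss.item())
```

In practice this loss is applied on top of the SFT checkpoint, with the reference model kept frozen as a copy of that checkpoint.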
## Open ko-leaderboard Rank
<img src='./ko-leaderboard.png' width=512>
## Implementation Code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
repo = "zomd/AISquare-Instruct-yi-ko-6b-v0.9.30"
model = AutoModelForCausalLM.from_pretrained(
repo,
return_dict=True,
torch_dtype=torch.float16,
device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
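Once the model and tokenizer are loaded, inference follows the standard `transformers` generation API. The snippet below is a usage sketch; the prompt and the sampling parameters (`max_new_tokens`, `temperature`, `top_p`) are illustrative choices, not values recommended by the authors.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "zomd/AISquare-Instruct-yi-ko-6b-v0.9.30"
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)

# Example Korean prompt ("What is the capital of South Korea?").
prompt = "대한민국의 수도는 어디인가요?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,   # illustrative decoding parameters
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```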
This research was conducted with the support of the "AI-Centered Industrial Convergence Cluster Development Project" promoted by the Artificial Intelligence Industry Cluster Agency (AICA).
---