---
license: cc-by-nc-sa-4.0
language:
- ja
base_model:
- llm-jp/llm-jp-3-13b
---

# Fine-tuned Japanese Instruction Model

This model is a fine-tuned version of the base model **[llm-jp/llm-jp-3-13b](https://huggingface.co/llm-jp/llm-jp-3-13b)**, trained on the **ichikara-instruction** dataset for **Japanese instruction-following tasks**.

---

## Model Information

### **Base Model**
- **Model**: [llm-jp/llm-jp-3-13b](https://huggingface.co/llm-jp/llm-jp-3-13b)  
- **Architecture**: Causal Language Model  
- **Parameters**: 13 billion  

### **Fine-tuning Dataset**
- **Dataset**: [ichikara-instruction](https://liat-aip.sakura.ne.jp/wp/llmのための日本語インストラクションデータ作成/)
- **Authors**: 関根聡, 安藤まや, 後藤美知子, 鈴木久美, 河原大輔, 井之上直也, 乾健太郎  
- **License**: [CC-BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)

The dataset consists of Japanese instruction-response pairs and is tailored for **instruction-following tasks**.

関根聡, 安藤まや, 後藤美知子, 鈴木久美, 河原大輔, 井之上直也, 乾健太郎. ichikara-instruction: Construction of Japanese Instruction Data for LLMs. The 30th Annual Meeting of the Association for Natural Language Processing (2024).
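
For illustration only, the sketch below shows how one instruction-response pair could be rendered into a single training string. The field names (`instruction`, `response`) and the `### 指示` / `### 回答` template are assumptions made for this example, not the dataset's published schema or the exact format used during fine-tuning.

```python
# Hypothetical sketch: format one instruction-response pair as a training string.
# Field names and the prompt template are illustrative assumptions, not the
# official ichikara-instruction schema.
def build_training_text(pair: dict) -> str:
    return f"### 指示\n{pair['instruction']}\n### 回答\n{pair['response']}"

example = {
    "instruction": "日本で一番高い山は何ですか？",
    "response": "日本で一番高い山は富士山です。",
}
print(build_training_text(example))
```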

---
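
## Usage Example

The snippet below is a minimal inference sketch using the Hugging Face Transformers library. The repository ID `your-username/llm-jp-3-13b-ichikara` and the `### 指示` / `### 回答` prompt format are placeholders assumed for this example; replace them with the actual model repository and the template used during fine-tuning.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID: substitute the actual fine-tuned model repo.
model_id = "your-username/llm-jp-3-13b-ichikara"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 13B parameters: use bf16/fp16 on a GPU if available
    device_map="auto",
)

# The prompt template is an assumption; match the format used during fine-tuning.
prompt = "### 指示\n日本の首都はどこですか？\n### 回答\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=False,
        repetition_penalty=1.1,
    )

# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that a 13B-parameter model in bf16 requires roughly 26 GB of GPU memory for the weights alone; quantized loading (for example, 4-bit via bitsandbytes) is a common workaround on smaller GPUs.

---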

## License

This model is released under the **CC-BY-NC-SA 4.0** license.

- **Base Model**: [llm-jp/llm-jp-3-13b](https://huggingface.co/llm-jp/llm-jp-3-13b) (Apache License 2.0)
- **Fine-Tuning Dataset**: ichikara-instruction (CC-BY-NC-SA 4.0)

**Fine-tuned Model License**:  
Because the ichikara-instruction dataset is distributed with a Share-Alike (SA) condition, the fine-tuned model inherits the **CC-BY-NC-SA 4.0** license.  
As a result, the model may be used for **non-commercial purposes** only, and any derivative works must be distributed under the same license.