---
license: cc-by-nc-4.0
language:
- ko
pipeline_tag: text-generation
tags:
- meta
- llama-2
- llama-2-ko
---

## Model Details

**Model Architecture**

urLLM-KO-7B is an auto-regressive language model that leverages an optimized transformer architecture derived from Llama-2-7b.

**Training Corpus**

The model was trained on selected subsets of the Modu Corpus and Korean Wikipedia (approximately 28 GB in total).

**Vocab Expansion**

The tokenizer's vocabulary was expanded to 51,385 tokens to improve coverage of Korean text.
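As a rough illustration (not the project's actual training code), vocabulary expansion means appending new token rows to the model's embedding matrix while preserving the original rows. Llama-2's base tokenizer has 32,000 tokens, so an expanded size of 51385 implies 19,385 newly added (largely Korean-oriented) tokens. The sizes and initialization below are a minimal sketch with toy dimensions:

```python
# Minimal sketch of vocabulary expansion (hypothetical logic, not the actual
# urLLM-KO-7B code). New token embeddings are appended after the base rows.
import random

BASE_VOCAB = 32_000      # Llama-2 tokenizer size
EXPANDED_VOCAB = 51_385  # size reported in this model card
HIDDEN_DIM = 8           # toy dimension for illustration (Llama-2-7b uses 4096)

def expand_embeddings(embeddings, new_vocab_size, hidden_dim):
    """Copy existing rows, then append randomly initialized rows for new tokens."""
    expanded = list(embeddings)
    for _ in range(new_vocab_size - len(embeddings)):
        expanded.append([random.gauss(0.0, 0.02) for _ in range(hidden_dim)])
    return expanded

# Toy embedding table standing in for the real 32,000 x 4,096 matrix.
old = [[0.0] * HIDDEN_DIM for _ in range(4)]  # pretend base vocab of 4
new = expand_embeddings(old, 10, HIDDEN_DIM)
print(len(old), len(new))           # 4 10
print(EXPANDED_VOCAB - BASE_VOCAB)  # 19385 newly added tokens
```

In practice this step corresponds to resizing the input and output embedding layers after extending the tokenizer, with the original Llama-2 rows kept intact so pretrained knowledge is preserved.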

**Model Card Contact**

For errors or additional questions about details in this model card, contact [email protected].