aashish1904 committed 6f70bbd (verified, 1 Parent(s): b1b0999)

Upload README.md with huggingface_hub

Files changed (1): README.md +73 -0
README.md ADDED
@@ -0,0 +1,73 @@
---
library_name: transformers
license: apache-2.0
base_model: mistralai/Mistral-Nemo-Instruct-2407
datasets:
- Saxo/ko_cn_translation_tech_social_science_linkbricks_single_dataset
- Saxo/ko_jp_translation_tech_social_science_linkbricks_single_dataset
- Saxo/en_ko_translation_tech_science_linkbricks_single_dataset_with_prompt_text_huggingface
- Saxo/en_ko_translation_social_science_linkbricks_single_dataset_with_prompt_text_huggingface
- Saxo/ko_aspect_sentiment_sns_mall_sentiment_linkbricks_single_dataset_with_prompt_text_huggingface
- Saxo/ko_summarization_linkbricks_single_dataset_with_prompt_text_huggingface
- Saxo/OpenOrca_cleaned_kor_linkbricks_single_dataset_with_prompt_text_huggingface
- Saxo/ko_government_qa_total_linkbricks_single_dataset_with_prompt_text_huggingface_sampled
- Saxo/ko-news-corpus-1
- Saxo/ko-news-corpus-2
- Saxo/ko-news-corpus-3
- Saxo/ko-news-corpus-4
- Saxo/ko-news-corpus-5
- Saxo/ko-news-corpus-6
- Saxo/ko-news-corpus-7
- Saxo/ko-news-corpus-8
- Saxo/ko-news-corpus-9
- maywell/ko_Ultrafeedback_binarized
- youjunhyeok/ko-orca-pair-and-ultrafeedback-dpo
- lilacai/glaive-function-calling-v2-sharegpt
- kuotient/gsm8k-ko
language:
- ko
- en
- ja
- zh
pipeline_tag: text-generation
---

![](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)

# QuantFactory/Linkbricks-Horizon-AI-Korean-Advanced-12B-GGUF
This is a quantized version of [Saxo/Linkbricks-Horizon-AI-Korean-Advanced-12B](https://huggingface.co/Saxo/Linkbricks-Horizon-AI-Korean-Advanced-12B), created with llama.cpp.
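
A minimal sketch of running one of the GGUF files locally with `llama-cpp-python`; the quant filename and generation settings below are assumptions, so substitute whichever quant you actually download from this repo:

```python
# Hedged sketch: run a GGUF quant from this repo with llama-cpp-python.
# The filename below is an assumption -- use the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="Linkbricks-Horizon-AI-Korean-Advanced-12B.Q4_K_M.gguf",
    n_ctx=8192,       # the model card advertises up to a 128k context window
    n_gpu_layers=-1,  # offload all layers to GPU when one is available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "한국의 수도에 대해 한 문장으로 설명해 주세요."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

The same `.gguf` file can also be served with the stock llama.cpp CLI or server; only the model path changes.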

# Original Model Card

# Model Card for Model ID

<div align="center">
<img src="https://www.linkbricks.com/wp-content/uploads/2022/03/%E1%84%85%E1%85%B5%E1%86%BC%E1%84%8F%E1%85%B3%E1%84%87%E1%85%B3%E1%84%85%E1%85%B5%E1%86%A8%E1%84%89%E1%85%B3%E1%84%85%E1%85%A9%E1%84%80%E1%85%A9-2-1024x804.png" />
</div>

Finetuned by Mr. Yunsung Ji (Saxo), a data scientist at Linkbricks, a company specializing in AI and big data analytics.<br>
A Korean language model based on Mistral-Nemo-Instruct-2407, trained with CPT (Continued Pretraining) -> SFT -> DPO on 8x H100-80G GPUs.<br>
Starting from a 10M-article Korean news corpus, it was trained on Korean-Chinese-English-Japanese cross-lingual data for a variety of tasks, together with math and logical-reasoning data, so that it handles cross-lingual augmentation across the four languages as well as complex logic problems.<br>
-The tokenizer is the base model's, used as-is without vocabulary expansion<br>
-Strengthened for high-dimensional analysis of customer reviews and social posts, as well as coding, writing, math and logical reasoning<br>
-128k context window<br>
-Supports Korean Function Call and Tool Calling<br>
-Trained with DeepSpeed Stage 3, rslora and BAdam Layer Mode<br>
<br><br>
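
For the full-precision original model rather than the GGUF quants, a minimal `transformers` sketch is shown below; the model ID comes from the card above, while the dtype, device, and generation settings are assumptions:

```python
# Hedged sketch: load the original (non-quantized) model with transformers,
# matching the card's library_name metadata; dtype/device choices are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Saxo/Linkbricks-Horizon-AI-Korean-Advanced-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "다음 문장을 영어로 번역해 주세요: 오늘 날씨가 참 좋네요."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```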

<a href="https://www.linkbricks.com">www.linkbricks.com</a>, <a href="https://www.linkbricks.vc">www.linkbricks.vc</a>