Update README.md
README.md (changed)
@@ -16,7 +16,7 @@ tags:
- merge

---
# Joah-Llama-3-KoEn-8B-Coder-v1

<a href="https://ibb.co/8XPkwP8"><img src="https://i.ibb.co/kMqZTqc/Joah.png" alt="Joah" border="0"></a><br />
@@ -24,14 +24,14 @@ tags:
"좋아(Joah)" by AsianSoul

A multilingual model merge based on this will follow soon, starting with German (Korean / English / German).

Where to use Joah: Medical, Korean, English, Translation, Code, Science...

## Merge Details

The performance of this merged model doesn't seem bad, though that is just my opinion.

This may not be a model that satisfies you. But if we continue to overcome our shortcomings,
@@ -50,11 +50,11 @@ I have found that most merged models out there so far do not actually have 64k
If you support me, I will try it on a computer with maximum specifications, and I would also like to run serious tests for you by building a network with high-capacity traffic and high-speed 10G links.

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [NousResearch/Meta-Llama-3-8B](https://huggingface.co/NousResearch/Meta-Llama-3-8B) as the base.

### Models Merged

The following models were included in the merge:
* [beomi/Llama-3-KoEn-8B-Instruct-preview](https://huggingface.co/beomi/Llama-3-KoEn-8B-Instruct-preview)
@@ -67,7 +67,7 @@ The following models were included in the merge:
* [asiansoul/Llama-3-Open-Ko-Linear-8B](https://huggingface.co/asiansoul/Llama-3-Open-Ko-Linear-8B)
* [aaditya/Llama3-OpenBioLLM-8B](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B)

### Ollama

Modelfile_Q5_K_M
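As a rough sketch of the create-and-test flow, assuming the Q5_K_M GGUF and the Modelfile_Q5_K_M from this repo are in the current directory (the prompt below is only an example, and file names differ for other quantizations):

```
# Minimal usage sketch; adjust the file name for other quantizations.
ollama create joah -f ./Modelfile_Q5_K_M    # register the model locally from the Modelfile
ollama run joah "Write a short Python quicksort function and explain it in Korean."    # quick interactive check
```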
@@ -102,7 +102,7 @@ ollama create joah -f ./Modelfile_Q5_K_M
Modelfile_Q5_K_M is the default; I hope you will test the many files uploaded to this repo, change the Modelfile to point at them, and create your own Ollama model.

### Configuration

The following YAML configuration was used to produce this model:
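Purely as an illustration of the general shape of a mergekit `dare_ties` configuration over models like these, here is a minimal sketch; the `density`, `weight`, and `dtype` values are placeholders rather than the settings actually used for Joah, and only a few of the merged models are listed:

```
# Illustrative sketch only; not the actual Joah configuration.
models:
  - model: beomi/Llama-3-KoEn-8B-Instruct-preview
    parameters:
      density: 0.5   # placeholder: fraction of delta weights kept by DARE
      weight: 0.3    # placeholder: contribution of this model in TIES merging
  - model: aaditya/Llama3-OpenBioLLM-8B
    parameters:
      density: 0.5
      weight: 0.2
  - model: asiansoul/Llama-3-Open-Ko-Linear-8B
    parameters:
      density: 0.5
      weight: 0.2
merge_method: dare_ties
base_model: NousResearch/Meta-Llama-3-8B
parameters:
  int8_mask: true
dtype: bfloat16
```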