ProdeusUnity committed on
Commit
799e86d
1 Parent(s): 6377063

Update README.md

Files changed (1): README.md (+68, -64)
README.md CHANGED
---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
# Astral-Fusion-8b-v0.0

*Join my dream, it's just the right time, whoa... Leave it all behind... Get ready now... Rise up into my world~*

Listen to the song on YouTube: https://www.youtube.com/watch?v=npyiiInMA0w

Another attempt at a merge, not entirely related to Stellar Odyssey. I like it, so try it out.

Merged Models:

- meta-llama/Llama-3-8b-Instruct
- Sao10K_L3-8B-Stheno-v3.2
- Gryphe_Pantheon-RP-1.0-8b-Llama-3
- Celeste-Stable-v1.2

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Edit: Celeste v1.2 Stable?

Celeste-Stable-v1.2 is itself a merge, made mainly to stabilize Celeste, since its training was at 256. It was merged with NeuralDareDevil via TIES.

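For illustration, a minimal sketch of what that TIES merge could look like in mergekit. The actual Celeste-Stable config was not published, so the model IDs, weights, and densities below are all assumptions:

```yaml
# Hypothetical sketch - NOT the published Celeste-Stable-v1.2 config.
# Model IDs are assumed stand-ins for the Celeste and NeuralDareDevil checkpoints.
models:
  - model: nothingiisreal/L3-8B-Celeste-V1.2
    parameters:
      weight: 0.5
      density: 0.5
  - model: mlabonne/NeuralDaredevil-8B-abliterated
    parameters:
      weight: 0.5
      density: 0.5
merge_method: ties
base_model: meta-llama/Meta-Llama-3-8B-Instruct  # TIES merges task vectors relative to a base
dtype: bfloat16
```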

## Merge Details
### Merge Method

This model was merged using the della_linear merge method, with C:\Users\\Downloads\Mergekit-Fixed\mergekit\meta-llama_Llama-3-8B-Instruct as the base.

### Models Merged

The following models were included in the merge:
* C:\Users\\Downloads\Mergekit-Fixed\mergekit\Gryphe_Pantheon-RP-1.0-8b-Llama-3
* C:\Users\\Downloads\Mergekit-Fixed\mergekit\Sao10K_L3-8B-Stheno-v3.2
* C:\Users\\Downloads\Mergekit-Fixed\mergekit\Celeste-Stable-v1.2-Test2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Sao10K_L3-8B-Stheno-v3.2
    parameters:
      weight: 0.3
      density: 0.25
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Celeste-Stable-v1.2-Test2
    parameters:
      weight: 0.1
      density: 0.4
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Gryphe_Pantheon-RP-1.0-8b-Llama-3
    parameters:
      weight: 0.4
      density: 0.5
merge_method: della_linear
base_model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\meta-llama_Llama-3-8B-Instruct
parameters:
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
```
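Roughly, for mergekit's DELLA-based methods, `epsilon` controls how far each parameter's drop probability may deviate from `1 - density` based on its magnitude, and `lambda` scales the final merged deltas; `lambda: 1` leaves them unscaled. A config like this is normally run with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./output-model-directory` (paths here are placeholders; adjust to your setup).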