Update README.md
README.md
CHANGED
@@ -11,41 +11,8 @@ tags:
 ---
 # merge
 
-This is a merge of pre-trained language models
-
-
-### Merge Method
-
-
-
-### Models Merged
-
-The following models were included in the merge:
-* [anthracite-org/magnum-v4-22b](https://huggingface.co/anthracite-org/magnum-v4-22b)
-* [TheDrummer/Cydonia-22B-v1.2](https://huggingface.co/TheDrummer/Cydonia-22B-v1.2)
-
-### Configuration
-
-The following YAML configuration was used to produce this model:
-
-```yaml
-models:
-  - model: Gryphe/Pantheon-RP-1.6.2-22b-Small
-    parameters:
-      density: 0.5
-      weight: 0.5
-  - model: TheDrummer/Cydonia-22B-v1.2
-    parameters:
-      density: 0.5
-      weight: 0.5
-  - model: anthracite-org/magnum-v4-22b
-    parameters:
-      density: 0.5
-      weight: 0.5
-merge_method: dare_ties
-base_model: Gryphe/Pantheon-RP-1.6.2-22b-Small
-parameters:
-  normalize: false
-  int8_mask: true
-dtype: float16
-```
+This is a merge of pre-trained language models
+
+We all know Pantheon, Cydonia, and Magnum. I was just curious what would happen if I merged them together.
+
+It is probably not broken? I will update this after testing.
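
For context, the YAML removed above is a mergekit DARE-TIES configuration: each model's parameter delta against the base is randomly sparsified to `density` and rescaled, then blended into the base with the given `weight`. Below is a minimal sketch of that drop-and-rescale step, assuming PyTorch and omitting the TIES sign-election that `dare_ties` also applies; `dare_delta` is a hypothetical helper for illustration, not mergekit code.

```python
import torch

def dare_delta(base: torch.Tensor, finetuned: torch.Tensor,
               density: float, weight: float) -> torch.Tensor:
    # DARE: randomly keep a `density` fraction of the delta entries,
    # then rescale by 1/density so the expected contribution is unchanged.
    delta = finetuned - base
    mask = torch.bernoulli(torch.full_like(delta, density))
    return weight * mask * delta / density

# Toy example: three "fine-tuned" tensors merged onto one base tensor,
# mirroring the density=0.5 / weight=0.5 settings in the removed config.
base = torch.randn(4, 4)
tuned = [torch.randn(4, 4) for _ in range(3)]
merged = base + sum(dare_delta(base, t, density=0.5, weight=0.5) for t in tuned)
```

With `normalize: false`, my understanding is that the 0.5 weights are applied as given rather than rescaled to sum to 1; to reproduce the merge, a config like this is typically passed to mergekit's `mergekit-yaml` entry point along with an output directory.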