sometimesanotion committed
Update README.md
README.md CHANGED
````diff
@@ -6,16 +6,11 @@ library_name: transformers
 tags:
 - mergekit
 - merge
-
+license: apache-2.0
 ---
-#
-
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
-## Merge Details
-### Merge Method
+# Notes
 
-This
+This is an experiment to try these extra SLERP parameters @bamec66557 uses in bamec66557/Qwen-2.5-14B-MINUS, but with the models I'm working on now. Do they make a difference to mergekit-gui? We'll see.
 
 ### Models Merged
 
@@ -67,4 +62,4 @@ postprocessing:
   range: [0.97, 1.03]
   kernel_size: 5
 
-```
+```
````
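For context, here is a minimal sketch of spherical linear interpolation (SLERP), the merge method whose extra parameters this experiment tunes. This illustrates only the underlying math in plain NumPy; it is not mergekit's implementation, and the function name and the near-parallel fallback threshold are assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Interpolates along the arc between v0 and v1; falls back to
    plain linear interpolation when the vectors are near-parallel,
    where the spherical formula becomes numerically unstable.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the vectors, from their normalized dot product.
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(np.sin(omega)) < eps:
        # Vectors are (anti)parallel: lerp fallback.
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)
```

At t=0 and t=1 the result is exactly the respective endpoint, and for unit vectors the interpolant stays on the unit sphere, which is the property that distinguishes SLERP from plain weight averaging.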