Update README.md
README.md CHANGED
@@ -9,6 +9,8 @@ tags:
 ---
 # Jotun_flam

+This model performs alright, but it suffers from one minor error and one larger error that I may get around to correcting eventually. Be aware that I inadvertently skipped Layer 20 from the original SLERPed model (minor), and that I merged two bf16 models using fp16 (major), getting the worst of both worlds. This MAY be inconsequential at Q4 and lower quants, but I am unsure.
+
 This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

 ## Merge Details
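
For anyone reproducing or fixing the merge, below is a minimal sketch of what a corrected mergekit config could look like. It assumes a passthrough-style merge over an intermediate SLERPed model; the model paths, layer counts, and slice boundaries are placeholders, not the actual Jotun_flam recipe. The two points it illustrates are contiguous `layer_range` boundaries (so layer 20 is not dropped) and `dtype: bfloat16` (so two bf16 parents are not down-cast to fp16).

```yaml
# Illustrative sketch only: model paths, layer counts, and slice boundaries
# are placeholders, not the actual Jotun_flam recipe.
merge_method: passthrough
slices:
  - sources:
      - model: ./slerped-intermediate   # placeholder: the earlier SLERPed model
        layer_range: [0, 20]            # layers 0-19 (end index is exclusive)
  - sources:
      - model: ./slerped-intermediate
        layer_range: [20, 40]           # begins at 20, so layer 20 is not skipped
dtype: bfloat16                         # both parents are bf16; float16 here would
                                        # truncate their exponent range
```

A config like this would typically be run with `mergekit-yaml config.yml ./output-model`.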