---
base_model: []
tags:
- mergekit
- merge
---

# Miquella 120B

## The model has been remade with the [fixed dequantization](https://huggingface.co/152334H/miqu-1-70b-sf) of miqu.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). It is an attempt at re-creating [goliath-120b](https://huggingface.co/alpindale/goliath-120b) using the new miqu-1-70b model in place of Xwin.

The merge ratios are the same as goliath's; the only difference is that Xwin is swapped out for miqu. A sketch of what the merge configuration may have looked like follows the model list below.
### Models Merged

The following models were included in the merge:

* [miqu-1-70b](https://huggingface.co/152334H/miqu-1-70b-sf)
* [Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B)
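
### Configuration

The exact config was not published with this card. Given the stated recipe (goliath's layer interleave with miqu substituted for Xwin), a mergekit passthrough config along these lines is plausible. The layer ranges mirror the ones goliath's card describes and the `dtype` is an assumption, so treat this as an illustrative sketch rather than the config actually used:

```yaml
# Illustrative sketch only -- not the published config.
# Interleaves the two 70B models in goliath's pattern,
# with miqu taking the slots Xwin held in goliath.
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0, 16]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [8, 24]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [17, 32]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [25, 40]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [33, 48]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [41, 56]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [49, 64]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [57, 72]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```

A config like this is run with `mergekit-yaml config.yml ./output-dir`; the output directory name here is arbitrary.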

*Miquella the Unalloyed, by @eldrtchmoon*