---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- Xwin
- WinterGoddess
- frankenmerge
- 120b
---
# Xwinter 120B

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/gs7KMVn4__G_JuksiuJIm.png" width=600>

A Goliath-120b-style frankenmerge of Xwin-LM-70b-v0.1 and WinterGoddess-1.4x-70b, meant as a slight update to Goliath that replaces Euryale with its successor model, WinterGoddess.

There's a similar merge called [WinterGoliath-123b](https://huggingface.co/ChuckMcSneed/WinterGoliath-123b) by [@ChuckMcSneed](https://huggingface.co/ChuckMcSneed).

# Prompting Format
Both the Vicuna and Alpaca prompt formats work.
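
For reference, the two templates look roughly like this (exact whitespace and any system prompt are not pinned down by this card):

```
Vicuna:

USER: {prompt}
ASSISTANT:

Alpaca:

### Instruction:
{prompt}

### Response:
```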

# Merge process
The models used in the merge are [Xwin-LM-70b-v0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).

The layer mix:
```yaml
- range 0, 16
  Xwin
- range 8, 24
  WinterGoddess
- range 17, 32
  Xwin
- range 25, 40
  WinterGoddess
- range 33, 48
  Xwin
- range 41, 56
  WinterGoddess
- range 49, 64
  Xwin
- range 57, 72
  WinterGoddess
- range 65, 80
  Xwin
```
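
For anyone wanting to reproduce the merge, a mergekit passthrough config matching the mix above would look roughly like the sketch below. The `merge_method`, `dtype`, and the half-open `layer_range` convention are assumptions based on how Goliath-style merges are usually built, not the exact config used here.

```yaml
# Sketch of a Goliath-style passthrough merge for mergekit.
# Assumes mergekit's half-open layer_range convention, i.e. [start, end).
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [0, 16]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [8, 24]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [17, 32]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [25, 40]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [33, 48]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [41, 56]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [49, 64]
  - sources:
      - model: Sao10K/WinterGoddess-1.4x-70B-L2
        layer_range: [57, 72]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```

With mergekit installed, something like `mergekit-yaml config.yml ./output-dir` builds the stacked model. Under the half-open convention the nine slices add up to 137 layers, which is roughly what puts the merge in the ~120b class.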

# Acknowledgements
[@Xwin-LM](https://huggingface.co/Xwin-LM) for creating Xwin.

[@Sao10K](https://huggingface.co/Sao10K) for creating WinterGoddess.

[@alpindale](https://huggingface.co/alpindale) for creating the original Goliath.

[@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit).