---
license: apache-2.0
datasets:
- Open-Orca/SlimOrca
language:
- en
---
### TeeZee/GALAXY-XB-v1.03 ###

An experiment: can DUS (Depth Up-Scaling) be taken one or more steps further?

### Technical notes:
- model v03 finetuned on 50k entries from the SlimOrca dataset
- 12 layers removed from both models, 4 more than in the original paper, but this equals 1/4 of all 48 layers, as prescribed by the original paper
- base version of upstage/SOLAR-10.7B-v1.0 used for the merge
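
The layer arithmetic behind the merge can be sketched as follows. This is an illustrative assumption, not the actual merge config: it assumes the standard DUS recipe of taking the bottom layers of one 48-layer copy and the top layers of another, with 12 layers dropped from each as described above.

```python
# Hypothetical sketch of DUS-style depth up-scaling with 48-layer models.
# Drop the top 12 layers of copy A and the bottom 12 layers of copy B,
# then stack A's remainder on top of B's remainder.
n_layers = 48
n_drop = 12

copy_a = list(range(0, n_layers - n_drop))   # layers 0..35 of model A
copy_b = list(range(n_drop, n_layers))       # layers 12..47 of model B

merged = copy_a + copy_b                     # depth-up-scaled layer stack
print(len(merged))                           # 72 layers in the merged model
```

With 12 layers removed from each copy, the resulting stack has 2 * (48 - 12) = 72 layers, compared to 48 in the base model.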


### To evaluate:
- model performance after finetuning: did it recover the initial performance loss from the merge?