---
license: apache-2.0
datasets:
- Open-Orca/SlimOrca
language:
- en
---
### TeeZee/GALAXY-XB-v1.03 ###
An experiment: can DUS (Depth Up-Scaling) be taken one or more steps further?
### Technical notes:
- model v03 finetuned on 50k entries from the SlimOrca dataset to produce this release (a hedged sketch of such a run follows this list)
- 12 layers removed from each source model: 4 more than in the original DUS paper, but still 1/4 of the total layer count (48), as the paper prescribes
- base version of upstage/SOLAR-10.7B-v1.0 used for the merge (see the merge sketch after this list)
- v03 itself involved no finetuning; it was just the merge, the first step of the DUS procedure
- the next step, if evaluation proves the merge is at least as 'smart' as the base model, is finetuning to 'recover' from the merge; this release is that step
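
The card does not state how the merge itself was performed; below is a minimal sketch of a DUS-style depth up-scaling in plain `transformers`, assuming the standard recipe of stacking two copies of SOLAR-10.7B with 12 seam layers dropped from each. In practice this kind of passthrough merge is often done with a tool such as mergekit instead; all paths and the exact slicing here are illustrative assumptions.

```python
import torch.nn as nn
from transformers import AutoModelForCausalLM

BASE = "upstage/SOLAR-10.7B-v1.0"   # 48 transformer layers
M = 12                              # layers dropped per copy (1/4 of 48)

copy_a = AutoModelForCausalLM.from_pretrained(BASE)
copy_b = AutoModelForCausalLM.from_pretrained(BASE)
n = copy_a.config.num_hidden_layers  # 48

# DUS seam: copy A keeps its bottom n - M layers, copy B its top n - M,
# giving 2 * (48 - 12) = 72 layers in the up-scaled model.
merged_layers = list(copy_a.model.layers[: n - M]) + list(copy_b.model.layers[M:])
copy_a.model.layers = nn.ModuleList(merged_layers)
copy_a.config.num_hidden_layers = len(merged_layers)

# Recent transformers versions track per-layer indices for KV caching; reindex.
for i, layer in enumerate(copy_a.model.layers):
    if hasattr(layer, "self_attn") and hasattr(layer.self_attn, "layer_idx"):
        layer.self_attn.layer_idx = i

copy_a.save_pretrained("GALAXY-XB-merged")  # hypothetical output path
```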
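
For the 50k-entry SlimOrca finetune in the first note, the card gives no hyperparameters or prompt template; this is a hedged sketch using TRL's `SFTTrainer` (whose API varies across TRL versions), with only the subset size taken from the card. The seed, flattening template, and paths are assumptions.

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# 50k-entry subset, as stated in the card; the shuffle seed is an assumption.
ds = load_dataset("Open-Orca/SlimOrca", split="train").shuffle(seed=42).select(range(50_000))

def to_text(example):
    # SlimOrca stores ShareGPT-style "conversations"; flatten each turn.
    # This "from: value" template is illustrative, not the one actually used.
    return {"text": "\n".join(f'{t["from"]}: {t["value"]}' for t in example["conversations"])}

ds = ds.map(to_text, remove_columns=ds.column_names)

trainer = SFTTrainer(
    model="GALAXY-XB-merged",  # hypothetical path to the merged v03 model
    train_dataset=ds,
    args=SFTConfig(output_dir="GALAXY-XB-v1.03"),
)
trainer.train()
```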
### To evaluate:
- model performance after finetuning: did it recover the initial performance loss from the merge? (a perplexity-comparison sketch follows)
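
The card does not name an evaluation suite; a minimal version of that check is to compare perplexity of the base model and this model on a public held-out corpus. The dataset choice and sample size below are illustrative assumptions.

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

def perplexity(model_id: str, texts: list[str]) -> float:
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    model.eval()
    losses = []
    for text in texts:
        enc = tok(text, return_tensors="pt", truncation=True, max_length=1024).to(model.device)
        with torch.no_grad():
            out = model(**enc, labels=enc["input_ids"])  # causal LM loss
        losses.append(out.loss.item())
    return float(torch.tensor(losses).mean().exp())

# Small wikitext-2 sample as a stand-in held-out set (an assumption).
sample = [r["text"] for r in load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
          if r["text"].strip()][:50]
print("base :", perplexity("upstage/SOLAR-10.7B-v1.0", sample))
print("v1.03:", perplexity("TeeZee/GALAXY-XB-v1.03", sample))
```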