---
license: apache-2.0
---
# TeeZee/GALAXY-XB-v.03
An experiment: can DUS be taken one or more steps further?
Technical notes:
- 12 layers removed from both model copies, 4 more than in the original paper, but still 1/4 of all layers (48), as per the original paper.
- the base version of upstage/SOLAR-10.7B-v1.0 was used for the merge
- no finetuning done yet; this is just the merge, the first step in the DUS paper
- the next step, if evaluation shows it is at least as 'smart' as the base model, should be finetuning to 'recover' performance after the merge
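A depth-up-scaling merge of this shape can be written as a mergekit passthrough config. This is a minimal sketch assuming mergekit's slice syntax and the split described above (two copies of the 48-layer base, dropping the top 12 from one and the bottom 12 from the other); it is not necessarily the exact config used for this model:

```yaml
# Hypothetical mergekit config: concatenate layers 0-35 of one copy
# with layers 12-47 of a second copy, yielding 72 layers total.
slices:
  - sources:
      - model: upstage/SOLAR-10.7B-v1.0
        layer_range: [0, 36]
  - sources:
      - model: upstage/SOLAR-10.7B-v1.0
        layer_range: [12, 48]
merge_method: passthrough
dtype: float16
```

With `merge_method: passthrough` the slices are simply stacked in order, with no weight averaging, which matches the "just a merge, no finetuning" state of this checkpoint.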