lorablated?

#2
by maldv - opened

It's a decent model, but I feel a pain in my side when I see you using the lorablated model as the base instead of adding the LoRA on at the end. 😡

The base model is used as a baseline measurement to compare models when merging; lorablated only means it has been stripped of refusals, which get added back in once the models are merged, just at a lower rate.

Not quite. The base model is subtracted from each of the other models to recover the fine-tuning task vectors before the theta "singular angles" are calculated. By subtracting out the lorablated model, you are merging the weights after removing the no-refusals delta and then adding it back in, so the task vectors are slightly twisted. If you subtract out the actual base model, you get the pure task vectors; and since the lorablated model is really just the application of mlabonne/Llama-3-70B-Instruct-abliterated-LORA, you could just apply that after the fact.
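To make the distinction concrete, here's a minimal sketch with plain torch tensors standing in for per-layer weights; `w_base`, `delta`, and the fine-tune tensors are illustrative stand-ins, not mergekit's actual internals:

```python
import torch

# Stand-in tensors for a single weight matrix; in practice these come
# from the safetensors shards of each model.
w_base = torch.randn(64, 64)                   # true base model weights
delta = 0.1 * torch.randn(64, 64)              # the abliterated-LoRA difference
w_lorablated = w_base + delta                  # lorablated = base + delta
fine_tunes = [w_base + 0.5 * torch.randn(64, 64) for _ in range(3)]

# Model Stock operates on task vectors: fine-tune minus base.
tv_clean = [w - w_base for w in fine_tunes]          # pure task vectors
tv_twisted = [w - w_lorablated for w in fine_tunes]  # each shifted by -delta

# Every "twisted" task vector is the clean one minus the same offset,
# which changes the angles computed between them during the merge.
print(torch.allclose(tv_twisted[0], tv_clean[0] - delta))  # True
```

Applying the LoRA "after the fact" would then just mean loading the merged model with peft, e.g. `PeftModel.from_pretrained(merged_model, "mlabonne/Llama-3-70B-Instruct-abliterated-LORA").merge_and_unload()`, assuming the adapter is in peft format.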

You're correct, I should have said subtracted, but the slight twisting is actually intentional in this model. It has a unique interaction between the weights that I liked from the original Astoria model, and it achieves a different outcome than you'd get from applying the components sequentially, since my goal is not to remove refusals as with the original ablated model (see the sketch below).
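To illustrate how the baked-in delta produces a different result than a sequential application, here's a rough numeric sketch. `simple_model_stock` is a simplified stand-in that uses the Model Stock interpolation ratio t = k·cosθ / (1 + (k−1)·cosθ) on whole flattened weights, not mergekit's actual per-layer implementation:

```python
import torch

def simple_model_stock(w_base, w_models):
    # Simplified stand-in: derive the interpolation ratio t from the cosine
    # of the angle between the two task vectors, then pull the average
    # fine-tune toward the base by that ratio.
    tvs = [w - w_base for w in w_models]
    c = torch.dot(tvs[0], tvs[1]) / (tvs[0].norm() * tvs[1].norm())
    k = len(w_models)
    t = (k * c / (1 + (k - 1) * c)).clamp(0, 1)
    return w_base + t * torch.stack(tvs).mean(0)

torch.manual_seed(0)
w_base = torch.randn(256)
delta = 0.2 * torch.randn(256)                 # abliterated-LoRA direction
w_lorablated = w_base + delta
fine_tunes = [w_base + 0.5 * torch.randn(256) for _ in range(2)]

# Order A: merge against the lorablated base (what this model does).
merged_a = simple_model_stock(w_lorablated, fine_tunes)

# Order B: merge against the true base, then add the LoRA delta on top.
merged_b = simple_model_stock(w_base, fine_tunes) + delta

print(torch.allclose(merged_a, merged_b))      # False: the results diverge
```

The two orderings diverge both because every task vector carries the extra -delta term into the average and because the angle-derived ratio t itself changes, which is exactly the interaction being described.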

When comparing the versions of the original Astoria (ablated base vs. normal base), I found I liked the replies more than those from a plain Model Stock merge, and carried that through to this model.

I think I see. It's an interesting thought that it might, in a way, perform some sort of lobotomy on the areas related to the difference between the base and the lorablated models.

That's the hope, but it's honestly hard to say. I would need a large number of people to A/B test the models to see whether there is really a difference or just perceived differences on my end.

I appreciate the input, though.