If I need a model that surpasses Llama 3 8B in all respects, how do you think I should merge it to reach 24B?
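For context, one common way to grow a model's parameter count is a layer-stacking ("passthrough") merge, e.g. with mergekit. A minimal sketch of such a config follows; the layer ranges and repetition are purely illustrative (simply duplicating layers inflates size but rarely improves quality without further fine-tuning):

```yaml
# Hypothetical mergekit passthrough config: stacks overlapping
# layer ranges of Llama 3 8B (32 layers) to produce a deeper model.
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3-8B
        layer_range: [0, 24]
  - sources:
      - model: meta-llama/Meta-Llama-3-8B
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```

This would roughly 1.5x the depth; reaching ~24B from an 8B base would require stacking substantially more layers (and likely healing the merge with continued training).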