---
base_model:
- ZeusLabs/Chronos-Platinum-72B
- EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
- m8than/banana-2-b-72b
- abacusai/Dracarys2-72B-Instruct
- rombodawg/Rombos-LLM-V2.5-Qwen-72b
- Qwen/Qwen2.5-72B
library_name: transformers
tags:
- mergekit
- merge
---
# EurobeatVARemix-Qwen2.5-72b

[![image/png](https://cdn-uploads.huggingface.co/production/uploads/633e85093a17ab61de8d9073/UqQ-TJ8ZgHk02zvO7Oy11.png)](https://www.youtube.com/watch?v=1gW1uHRPChc)

Updated EVA to 0.1. That's all, folks! ...It didn't feel right calling it LLENN anymore, so I'm changing the name. ["Pray I don't alter it any further."]()

**Please do not ask for quants; contact other quantizers instead.**

*All models will be available for testing on [featherless.ai](https://featherless.ai) as soon as it goes live.*

## Merge Details

### Merge Method

This model was merged with the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, using [Qwen/Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) as the base.

### Prompt Format

ChatML works for the most part. A template sketch is included at the end of this card.

### Sampler Settings

Personally I use the following:

```
Temp: 1.2
Min P: 0.07
Rep Pen: 1.1
```

Others have suggested the following:

```
Temp: 1.1
Top P: 0.98
Min P: 0.05
```

A generation sketch applying the first set also appears at the end of this card.

### Models Merged

The following models were included in the merge:

* [ZeusLabs/Chronos-Platinum-72B](https://huggingface.co/ZeusLabs/Chronos-Platinum-72B)
* [EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1)
* [m8than/banana-2-b-72b](https://huggingface.co/m8than/banana-2-b-72b)
* [abacusai/Dracarys2-72B-Instruct](https://huggingface.co/abacusai/Dracarys2-72B-Instruct)
* [rombodawg/Rombos-LLM-V2.5-Qwen-72b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-72b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
  - model: ZeusLabs/Chronos-Platinum-72B
  - model: abacusai/Dracarys2-72B-Instruct
  - model: rombodawg/Rombos-LLM-V2.5-Qwen-72b
  - model: m8than/banana-2-b-72b
merge_method: model_stock
base_model: Qwen/Qwen2.5-72B
parameters:
  normalize: true
dtype: bfloat16
```
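If you want to reproduce the merge from the config above, mergekit exposes a Python API alongside its CLI. This is a minimal sketch, not a verbatim recipe: it assumes the YAML is saved as `config.yaml`, the output directory name is a placeholder, and the option names match mergekit at the time of writing.

```python
# Minimal sketch of reproducing this merge with mergekit's Python API.
# Assumes the YAML above is saved as config.yaml; the output path is a placeholder.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./EurobeatVARemix-Qwen2.5-72b",  # hypothetical output directory
    options=MergeOptions(
        cuda=True,            # do the merge math on GPU if available
        copy_tokenizer=True,  # carry the base tokenizer into the output
        lazy_unpickle=True,   # reduce peak RAM while loading shards
    ),
)
```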
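Since the card recommends ChatML, here is the prompt layout the model expects; the message text is purely illustrative.

```python
# Sketch of the ChatML layout this card recommends; message text is illustrative.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Write me some eurobeat lyrics.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
```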
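And one way to apply the first set of sampler settings through `transformers`. Another sketch under assumptions: the repo id below is a placeholder, `min_p` requires a reasonably recent `transformers` release, and a 72B model needs substantial GPU memory (or quantization you arrange yourself).

```python
# Sketch: generate with the suggested samplers (Temp 1.2 / Min P 0.07 / Rep Pen 1.1).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EurobeatVARemix-Qwen2.5-72b"  # placeholder: use the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# apply_chat_template uses the tokenizer's bundled template,
# which is ChatML-style for Qwen2.5-family tokenizers.
messages = [{"role": "user", "content": "Write me some eurobeat lyrics."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    do_sample=True,
    temperature=1.2,
    min_p=0.07,
    repetition_penalty=1.1,
    max_new_tokens=256,
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```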