---
base_model:
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/professional_psychology
- DreadPoor/Suavemente-8B-Model_Stock
- tannedbum/L3-Nymeria-v2-8B
- Azazelle/Llama3-RP-Lora
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/sociology
- tannedbum/L3-Nymeria-v2-8B
- ResplendentAI/Smarts_Llama3
- tannedbum/L3-Nymeria-v2-8B
- DreadPoor/OpenBioLLM-8B-r64-LoRA
- tannedbum/L3-Nymeria-v2-8B
- kik41/lora-type-narrative-llama-3-8b-v2
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/formal_logic
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/health
- tannedbum/L3-Nymeria-v2-8B
- vincentyandex/lora_llama3_chunked_novel_bs128
- tannedbum/L3-Nymeria-v2-8B
- Azazelle/Llama-3-8B-Abomination-LORA
- tannedbum/L3-Nymeria-v2-8B
- kik41/lora-length-long-llama-3-8b-v2
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/anatomy
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/human_sexuality
- tannedbum/L3-Nymeria-v2-8B
- kik41/lora-type-descriptive-llama-3-8b-v2
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [DreadPoor/Suavemente-8B-Model_Stock](https://huggingface.co/DreadPoor/Suavemente-8B-Model_Stock) as the base.
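For intuition, Model Stock averages the fine-tuned checkpoints and then pulls that average back toward the base weights, with the interpolation factor derived from how similar the fine-tuned "task vectors" (fine-tuned minus base) are to one another. The sketch below is a simplified per-tensor illustration of the formula from the paper, not mergekit's actual implementation:

```python
# Hedged sketch of the Model Stock idea (arXiv:2403.19522), applied to one weight tensor.
# This is an illustration only; mergekit's model_stock implementation may differ in detail.
import torch


def model_stock_layer(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    n = len(finetuned)
    deltas = [ft - base for ft in finetuned]  # task vectors relative to the base

    # Average pairwise cosine similarity between task vectors.
    cos_vals = []
    for i in range(n):
        for j in range(i + 1, n):
            a, b = deltas[i].flatten(), deltas[j].flatten()
            cos_vals.append(torch.nn.functional.cosine_similarity(a, b, dim=0))
    cos_theta = torch.stack(cos_vals).mean()

    # Interpolation factor from the paper: t = N*cos / (1 + (N-1)*cos).
    t = (n * cos_theta) / (1 + (n - 1) * cos_theta)

    avg = torch.stack(finetuned).mean(dim=0)
    return t * avg + (1 - t) * base
```

In practice this is applied to every weight tensor of every model listed in the configuration; the sketch assumes all tensors share the base model's shapes.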
### Models Merged
The following models were included in the merge; each entry applies a LoRA adapter to the base checkpoint (see the sketch after this list):
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/professional_psychology](https://huggingface.co/surya-narayanan/professional_psychology)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [Azazelle/Llama3-RP-Lora](https://huggingface.co/Azazelle/Llama3-RP-Lora)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/sociology](https://huggingface.co/surya-narayanan/sociology)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [ResplendentAI/Smarts_Llama3](https://huggingface.co/ResplendentAI/Smarts_Llama3)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [DreadPoor/OpenBioLLM-8B-r64-LoRA](https://huggingface.co/DreadPoor/OpenBioLLM-8B-r64-LoRA)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [kik41/lora-type-narrative-llama-3-8b-v2](https://huggingface.co/kik41/lora-type-narrative-llama-3-8b-v2)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/formal_logic](https://huggingface.co/surya-narayanan/formal_logic)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/health](https://huggingface.co/surya-narayanan/health)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [vincentyandex/lora_llama3_chunked_novel_bs128](https://huggingface.co/vincentyandex/lora_llama3_chunked_novel_bs128)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [Azazelle/Llama-3-8B-Abomination-LORA](https://huggingface.co/Azazelle/Llama-3-8B-Abomination-LORA)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [kik41/lora-length-long-llama-3-8b-v2](https://huggingface.co/kik41/lora-length-long-llama-3-8b-v2)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/anatomy](https://huggingface.co/surya-narayanan/anatomy)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/human_sexuality](https://huggingface.co/surya-narayanan/human_sexuality)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [kik41/lora-type-descriptive-llama-3-8b-v2](https://huggingface.co/kik41/lora-type-descriptive-llama-3-8b-v2)
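Each "+" above means the LoRA adapter is applied to the base checkpoint before the merge. Mergekit handles this step internally when it sees the `model+lora` syntax; the snippet below is only a hedged illustration of how one such pair could be materialized by hand with peft (the output path is hypothetical):

```python
# Illustrative only: materialize one "base + LoRA" pair as a plain checkpoint.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("tannedbum/L3-Nymeria-v2-8B")
with_lora = PeftModel.from_pretrained(base, "surya-narayanan/formal_logic")
merged = with_lora.merge_and_unload()          # fold the adapter into the base weights
merged.save_pretrained("./nymeria-formal-logic")  # hypothetical output directory
```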
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/formal_logic
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/health
- model: tannedbum/L3-Nymeria-v2-8B+vincentyandex/lora_llama3_chunked_novel_bs128
- model: tannedbum/L3-Nymeria-v2-8B+kik41/lora-type-narrative-llama-3-8b-v2
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/sociology
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/professional_psychology
- model: tannedbum/L3-Nymeria-v2-8B+ResplendentAI/Smarts_Llama3
- model: tannedbum/L3-Nymeria-v2-8B+Azazelle/Llama3-RP-Lora
- model: tannedbum/L3-Nymeria-v2-8B+DreadPoor/OpenBioLLM-8B-r64-LoRA
- model: tannedbum/L3-Nymeria-v2-8B+Azazelle/Llama-3-8B-Abomination-LORA
- model: tannedbum/L3-Nymeria-v2-8B+kik41/lora-type-descriptive-llama-3-8b-v2
- model: tannedbum/L3-Nymeria-v2-8B+kik41/lora-length-long-llama-3-8b-v2
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/anatomy
- model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/human_sexuality
merge_method: model_stock
base_model: DreadPoor/Suavemente-8B-Model_Stock
dtype: float32
```
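To reproduce the merge, one would typically save the YAML above to a file and run mergekit's CLI, for example `mergekit-yaml config.yaml ./merged-model`. The resulting folder is a standard Hugging Face checkpoint; a minimal loading sketch with transformers follows (the local path is a placeholder for wherever the merge output was written):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder path to the merge output
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype="auto", device_map="auto")

inputs = tokenizer("Briefly explain what a model merge is.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```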