---
base_model:
- shibiyaj/lawGPT-chat
- AdityaXPV/Mistral-7B-law-sage-v0.3
library_name: transformers
tags:
- mergekit
- merge
---
# merged

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [AdityaXPV/Mistral-7B-law-sage-v0.3](https://huggingface.co/AdityaXPV/Mistral-7B-law-sage-v0.3) as the base model. TIES reduces interference between fine-tuned models by trimming low-magnitude parameter deltas and electing a consistent sign per parameter before merging the task vectors.

### Models Merged

The following models were included in the merge:
* [shibiyaj/lawGPT-chat](https://huggingface.co/shibiyaj/lawGPT-chat)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: AdityaXPV/Mistral-7B-law-sage-v0.3
dtype: float16
merge_method: ties
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 32]
    model: shibiyaj/lawGPT-chat
    parameters:
      density: 0.5
      weight: 0.5
  - layer_range: [0, 32]
    model: AdityaXPV/Mistral-7B-law-sage-v0.3
```

Here `density: 0.5` keeps half of the task-vector parameters from lawGPT-chat, and `weight: 0.5` scales its contribution relative to the base model.
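
For quick experimentation, a minimal loading sketch with 🤗 Transformers is shown below. Note the repository ID is a placeholder (this card does not state the final Hub path), and the prompt is only illustrative:

```python
# Minimal sketch for loading the merged model with Hugging Face Transformers.
# "your-username/merged" is a placeholder; substitute the actual Hub repo ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/merged"  # placeholder Hub path for this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",
)

# Illustrative prompt only; adjust formatting to the chat template, if any.
prompt = "What is the difference between a contract and a deed?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Assuming the configuration above is saved as `config.yaml`, the merge itself can typically be reproduced with mergekit's CLI: `mergekit-yaml config.yaml ./merged`.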