---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- unsloth/mistral-7b-v0.2
- mistralai/Mistral-7B-Instruct-v0.2
- quantized
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- chatml
base_model:
- unsloth/mistral-7b-v0.2
- mistralai/Mistral-7B-Instruct-v0.2
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# NeuralNovel/Mini-Mixtral-v0.2 AWQ

- Model creator: [NeuralNovel](https://huggingface.co/NeuralNovel)
- Original model: [Mini-Mixtral-v0.2](https://huggingface.co/NeuralNovel/Mini-Mixtral-v0.2)

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/645cfe4603fc86c46b3e46d1/DOoAs2yzNOUC465BSM9-s.jpeg)

## Model Summary

Mini-Mixtral-v0.2 is a Mixture of Experts (MoE) model built from the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [unsloth/mistral-7b-v0.2](https://huggingface.co/unsloth/mistral-7b-v0.2)
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
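
This AWQ build can be loaded with standard `transformers` tooling once the `autoawq` package is installed. The sketch below is illustrative only: the repository id is an assumption (replace it with the actual id of this quantized upload), and the generation settings are ordinary defaults.

```python
# Minimal loading sketch for the 4-bit AWQ weights.
# Assumes a repository id such as "solidrust/Mini-Mixtral-v0.2-AWQ" (hypothetical --
# substitute the real repo). Requires `transformers` and `autoawq`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Mini-Mixtral-v0.2-AWQ"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",      # place the quantized weights on available GPUs
    low_cpu_mem_usage=True,
)

# The `chatml` tag suggests a ChatML-style prompt; apply_chat_template uses
# whatever chat template ships with the tokenizer.
messages = [{"role": "user", "content": "Briefly explain Mixture of Experts models."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```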

<a href='https://ko-fi.com/S6S2UH2TC' target='_blank'><img height='38' style='border:0px;height:36px;' src='https://storage.ko-fi.com/cdn/kofi1.png?v=3' border='0' alt='Buy Me a Coffee at ko-fi.com' /></a>
<a href='https://discord.gg/GtkUUP2qJE' target='_blank'><img width='140' height='38' style='border:0px;height:36px;' src='https://i.ibb.co/tqwznYM/Discord-button.png' border='0' alt='Join Our Discord!' /></a>