---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- code
- QA
- reasoning
---
# Model Card for Model ID

## Model Details

### Model Description
A powerful MoE 4x7B Mixtral-of-Mistral model, built from HuggingFaceH4/zephyr-7b-beta, mistralai/Mistral-7B-Instruct-v0.2, teknium/OpenHermes-2.5-Mistral-7B, and Intel/neural-chat-7b-v3-3, for greater accuracy and precision in general reasoning, QA, and code.
- Developed by: NEXT AI
- Funded by: Zpay Labs Pvt Ltd.
- Model type: Mixtral of Mistral, 4x7B mixture-of-experts
- Language(s) (NLP): English
- Intended tasks: Code, Reasoning, QA
### Model Sources

- Demo: https://nextai.co.in
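Since the card lists `transformers` as the library, a minimal usage sketch may help. This is an illustrative example only: the repo id `"REPO_ID"` is a placeholder (the actual model id is not given above), and the `[INST]` prompt format is assumed from the Mistral-Instruct base models; verify against the tokenizer's own chat template before use.

```python
# Hedged sketch of loading and prompting the model with transformers.
# "REPO_ID" is a placeholder for the actual Hugging Face repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer


def format_mistral_prompt(user_message: str) -> str:
    """Wrap a user message in the Mistral-Instruct style template (assumed)."""
    return f"<s>[INST] {user_message} [/INST]"


def generate(repo_id: str, prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for a single prompt."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
    inputs = tokenizer(format_mistral_prompt(prompt), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Example call: `generate("REPO_ID", "Write a Python function to reverse a string.")` (requires downloading the full 4x7B weights).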