Marcoroni-neural-chat-7B-v2

Model Details

This model is a merge of AIDC-ai-business/Marcoroni-7B-v3 and Intel/neural-chat-7b-v3-3, created for testing purposes with the mergekit tool using the SLERP merge method. Both source models are based on mistralai/Mistral-7B-v0.1. The merged model has 7.24B parameters and is published as FP16 safetensors.
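For reference, a SLERP merge of this kind can be reproduced with mergekit. The sketch below is illustrative only: the layer range, interpolation factor `t`, base model choice, and dtype are assumptions, not the exact settings used for this model.

```python
# Minimal sketch of a SLERP merge with mergekit (assumed settings, not the
# exact configuration used to produce Marcoroni-neural-chat-7B-v2).
import subprocess
from pathlib import Path

config = """\
slices:
  - sources:
      - model: AIDC-ai-business/Marcoroni-7B-v3
        layer_range: [0, 32]
      - model: Intel/neural-chat-7b-v3-3
        layer_range: [0, 32]
merge_method: slerp
base_model: AIDC-ai-business/Marcoroni-7B-v3   # assumed base; could be either parent
parameters:
  t: 0.5          # assumed uniform interpolation factor between the two models
dtype: float16
"""

Path("slerp-config.yml").write_text(config)

# mergekit's CLI reads the config and writes the merged weights to ./merged
subprocess.run(["mergekit-yaml", "slerp-config.yml", "./merged"], check=True)
```

The resulting directory can then be loaded like any other Mistral-7B checkpoint, e.g. with transformers' AutoModelForCausalLM and AutoTokenizer.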
