SlimOrca Mixtral 8x7B

Built with Axolotl

OpenOrca Logo

Official release of the SlimOrca finetune of Mixtral 8x7B. More details to come.

Model Details

Model Description

  • Developed by: OpenAccess AI Collective and OpenOrca
  • Finetuned from model: mistralai/Mixtral-8x7B-v0.1
  • Model size: 46.7B parameters (Safetensors)
  • Tensor type: BF16
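
Example Usage

Until fuller documentation lands, the sketch below shows one way to load the checkpoint with Hugging Face transformers. It is a minimal example, not an official recipe: it assumes bfloat16 weights (matching the tensor type above), the accelerate package for device_map="auto", and a plain-text prompt, since the card does not yet specify a prompt template.

```python
# Minimal loading sketch (assumptions noted above; not from the original card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Open-Orca/Mixtral-SlimOrca-8x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensors listed above
    device_map="auto",           # requires accelerate; spreads layers across available GPUs
)

# Plain-text prompt; the card does not yet document a chat/prompt template.
prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the full 46.7B-parameter checkpoint in BF16 needs substantial GPU memory; quantized variants exist for smaller setups.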
