Contributors: Nicolas Mejia Petit

License

Mistral 29b: A New Base Model

The objective of this model is to serve as a new, fully open-source base model with 29.2 billion parameters.

Out of the box, this model outputs incoherent text and needs to be fine-tuned, either with QLoRA (with the adapter attached to every layer) or, better yet, a full fine-tune.
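The LoRA idea mentioned above (low-rank adapters on the model's layers) can be sketched in a few lines. This is a minimal, pure-Python illustration of the adapter math, not the actual training setup; the dimensions, rank, and scaling factor here are hypothetical, and a real fine-tune would use a library such as `peft`.

```python
# Minimal sketch of a LoRA adapter on one linear layer (illustrative only).
# The frozen base weight W is augmented with a trainable low-rank update B @ A.

def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m * v for m, v in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha=16, r=2):
    """Compute y = W x + (alpha / r) * B (A x).

    W: frozen base weight (out x in)
    A: trainable down-projection (r x in)
    B: trainable up-projection (out x r)
    Only A and B are updated during fine-tuning.
    """
    base = matvec(W, x)
    low_rank = matvec(B, matvec(A, x))
    scale = alpha / r
    return [b + scale * l for b, l in zip(base, low_rank)]

# With B initialized to zeros, the adapter starts as a no-op:
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[0.5, 0.5], [0.5, -0.5]]
B = [[0.0, 0.0], [0.0, 0.0]]
print(lora_forward(W, A, B, [2.0, 3.0]))  # -> [2.0, 3.0]
```

Because B starts at zero, training begins from the base model's behavior and only gradually learns a correction, which is what makes attaching adapters to every layer cheap compared to a full fine-tune.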

Model Creation

The model was created by stacking four models (Dolphin, Zephyr, Meta-math7b, and Speechless code) depth-wise into a single model.
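The card does not say which tool performed the merge, but depth-wise stacking of this kind amounts to concatenating layer ranges from the source models into one deeper stack. A toy sketch, with hypothetical 32-layer donor models standing in for the four 7B sources:

```python
# Toy sketch of depth-wise model stacking ("frankenmerging").
# Each donor model is represented only by its list of layer labels;
# a real merge would copy the corresponding weight tensors instead.

def stack_models(donors, recipe):
    """Build a deeper model by concatenating layer ranges from donors.

    donors: dict of name -> list of layers
    recipe: list of (name, start, end) ranges, applied in order
    """
    merged = []
    for name, start, end in recipe:
        merged.extend(donors[name][start:end])
    return merged

# Hypothetical donors: four 32-layer models.
names = ["dolphin", "zephyr", "metamath", "speechless"]
donors = {m: [f"{m}.layer{i}" for i in range(32)] for m in names}

# Hypothetical recipe: take all 32 layers of each donor in sequence.
recipe = [(m, 0, 32) for m in names]
merged = stack_models(donors, recipe)
print(len(merged))  # -> 128
```

Stacking four full 7B-parameter stacks this way roughly quadruples the parameter count, which is consistent with the ~28B size reported below; the embedding and output layers are shared rather than duplicated, so the total is slightly less than 4x.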

Useful Resources

Source Models

Safetensors
Model size: 28.2B params
Tensor type: BF16