FULL RELEASE!!!

Introducing MFANN (Makhi's Fully Autonomous Neural Network), a family of chain-of-thought (CoT) models fine-tuned on a modified Alpaca training regimen in which each dataset sample defines a "thought process", allowing the model to generate reasoning tokens before it generates its final output (yes, this has been in the works MUCH longer than o1 has been out!)
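To make the idea concrete, here is a minimal sketch of what one training sample in such a modified Alpaca format might look like. The field names ("instruction", "thought-process", "output") and the way the fields are concatenated are assumptions for illustration, not the actual dataset schema:

```python
# Hypothetical sketch of a modified Alpaca sample with a reasoning field.
# Field names and formatting are illustrative assumptions, not the real schema.
sample = {
    "instruction": "What is 17 * 4?",
    "thought-process": "17 * 4 = 17 * 2 * 2 = 34 * 2 = 68.",
    "output": "17 * 4 is 68.",
}

# During fine-tuning, the thought-process text would sit between the
# instruction and the answer, teaching the model to emit reasoning
# tokens before its final output.
training_text = (
    f"Instruct: {sample['instruction']} "
    f"Thought: {sample['thought-process']} "
    f"Output: {sample['output']}"
)
print(training_text)
```

The key point is simply ordering: because the reasoning tokens always precede the answer in training, the model learns to produce them first at inference time.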

This is the smaller 3B model, based on phi-2.

The 8B model, based on an abliterated Llama 3.1 8B, will be coming in the next few days!! Stay tuned!!

If you would like to support me and keep this project going well into the future (or even free me up from the horrors of Little Caesars so I can work on this more, lmfao), please consider donating to my Patreon: https://www.patreon.com/c/MakhiBurroughs. All members get to vote on what gets done next with MFANN, and additional perks are currently in the works!

Prompt format:

Instruct: {instruction} Output:
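A quick sketch of filling that template in Python, assuming {instruction} is a plain placeholder replaced with the user's request (the example instruction is made up):

```python
def build_prompt(instruction: str) -> str:
    """Fill the MFANN prompt template shown above with a user instruction."""
    return f"Instruct: {instruction} Output:"

# Illustrative usage; pass the resulting string to your inference
# library of choice (e.g. a transformers text-generation pipeline).
prompt = build_prompt("Explain what a chain-of-thought model is.")
print(prompt)
```

The model is then expected to generate its reasoning tokens followed by the answer after the "Output:" marker.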

Model size: 2.78B parameters (F32, Safetensors)
