Description

Mistral-7B model fine-tuned with supervised fine-tuning (SFT) to respond like Johnny Silverhand from the Cyberpunk universe (Mike Pondsmith, CD Projekt Red).

The model can be used with Ollama or any other application that supports the .gguf model format.
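
For example, here is a minimal sketch of loading the file with llama-cpp-python; the local .gguf filename below is an assumption, so check the files in the jubba/silverhand_mistral7b_v0_1_GGUF repository for the actual name.

```python
# Minimal sketch: chatting with the GGUF file via llama-cpp-python.
# The model_path filename is assumed; point it at the .gguf file you
# downloaded from jubba/silverhand_mistral7b_v0_1_GGUF.
from llama_cpp import Llama

llm = Llama(
    model_path="./silverhand_mistral7b_v0_1_q8_0.gguf",  # assumed filename
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if available; set 0 for CPU only
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are Johnny Silverhand."},
        {"role": "user", "content": "Wake up, samurai. What do we do now?"},
    ],
    max_tokens=256,
    temperature=0.8,
)
print(response["choices"][0]["message"]["content"])
```

With Ollama, the same .gguf file can be registered by pointing a Modelfile's FROM line at it and then running ollama create followed by ollama run.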

The model is quantized to 8-bit using llama.cpp.

GGUF file: 7.24B parameters, llama architecture, 8-bit quantization.