---
license: apache-2.0
base_model:
- anthracite-org/magnum-v3-9b-customgemma2
- lemon07r/Gemma-2-Ataraxy-9B
library_name: transformers
tags:
- mergekit
- merge
---
![](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)
# QuantFactory/magnaraxy-9b-GGUF
This is a quantized version of [lodrick-the-lafted/magnaraxy-9b](https://huggingface.co/lodrick-the-lafted/magnaraxy-9b), created using llama.cpp.
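
If you want to try the GGUF files from Python, llama-cpp-python can load them directly. Below is a minimal sketch; the quant filename is an assumption, so substitute whichever file from this repo you actually downloaded.

```python
# Minimal sketch: loading a GGUF quant with llama-cpp-python.
# The filename below is an assumption -- use the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="magnaraxy-9b.Q4_K_M.gguf",  # assumed filename
    n_ctx=4096,                             # context window to allocate
)

# Quick smoke test: generate a few tokens to confirm the model loads.
print(llm("Hello", max_tokens=32)["choices"][0]["text"])
```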
# Original Model Card
<img src="https://huggingface.co/lodrick-the-lafted/magnaraxy-9b/resolve/main/claude_rocks.png">
A 50/50 slerp merge of [anthracite-org/magnum-v3-9b-customgemma2](https://huggingface.co/anthracite-org/magnum-v3-9b-customgemma2) with [lemon07r/Gemma-2-Ataraxy-9B](https://huggingface.co/lemon07r/Gemma-2-Ataraxy-9B).
I like the result; maybe you will too. Anthracite trained magnum-v3-9b-customgemma2 with system prompts, and they work here too:
```
<start_of_turn>system
Pretend to be a cat.<end_of_turn>
<start_of_turn>user
Hi there!<end_of_turn>
<start_of_turn>model
nya!<end_of_turn>
<start_of_turn>user
Do you want some nip?<end_of_turn>
<start_of_turn>model
```
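
For reference, here is a hedged sketch of feeding that template to the model with llama-cpp-python; the GGUF filename is an assumption, and the system/user turns simply mirror the example above.

```python
# Sketch of driving the Gemma-2-style template shown above.
# The GGUF filename is an assumption -- use the quant you downloaded.
from llama_cpp import Llama

llm = Llama(model_path="magnaraxy-9b.Q4_K_M.gguf", n_ctx=4096)

# Build the prompt exactly as in the template: system turn, then user,
# then an open model turn for the completion.
prompt = (
    "<start_of_turn>system\n"
    "Pretend to be a cat.<end_of_turn>\n"
    "<start_of_turn>user\n"
    "Do you want some nip?<end_of_turn>\n"
    "<start_of_turn>model\n"
)

# Stop on the end-of-turn tag so generation ends with the model's reply.
out = llm(prompt, max_tokens=64, stop=["<end_of_turn>"])
print(out["choices"][0]["text"])
```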