---
license: apache-2.0
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65995c45539c808e84c38bf1/_KhIfo70XNOWvxLScZM-P.png)

> ***Hello darkness, my old friend I've come to talk with you again.***

---

**IMPORTANT**: Make sure you have the latest version of [llama.cpp](https://github.com/ggerganov/llama.cpp) to use these:

- The [PR that allows multiple control vectors to be loaded](https://github.com/ggerganov/llama.cpp/commit/97877eb10bd8e7f8023420b5b5300bcbdadd62dc) (without zero-padding) was merged on 27/06/24.
- Older versions of `llama.cpp` will just ***silently*** load the first control vector (and none of the others) if the layer index of the final direction in each file does not match...

---

To use these control vectors effectively you will need to use the "`--control-vector-scaled`" option like this:

```sh
llama-cli --model <model>.gguf --control-vector-scaled <model>-dark.gguf 0.5 --control-vector-scaled <model>-chaos.gguf 0.5 [the rest of your CLI arguments...]
```

or:

```sh
llama-cli --model <model>.gguf --control-vector-scaled <model>-dark.gguf 1.0 --control-vector-scaled <model>-chaos.gguf 0.0 [the rest of your CLI arguments...]
```

or:

```sh
llama-cli --model <model>.gguf --control-vector-scaled <model>-dark.gguf 0.5 --control-vector-scaled <model>-chaos.gguf 0.25 [the rest of your CLI arguments...]
```

**NOTE:**

- Use ***positive scale factors*** to make the model "***more dark***" or "***more chaotic***".
- I suggest starting with a scale factor of `0.5` for each control vector and then testing the effect.
- The "chaos" control vectors generally seem less effective than the "dark" control vectors.
- Some models like `command-r:35b` and `command-r-plus:104b` need lower scale factors, whereas `miqu-1:70b` seems to need (much) higher scale factors to stamp out pesky redemption arcs.
- You can use one control vector file alone if you want, or alternatively set the scale factor to `0.0` for traits you don't want to use.
- You can use the same "`--control-vector-scaled`" command line arguments for "`llama-server`" as in the above "`llama-cli`" examples.
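For example, a minimal `llama-server` sketch, assuming hypothetical file names `<model>.gguf`, `<model>-dark.gguf`, and `<model>-chaos.gguf` (substitute your actual model and control vector paths):

```sh
# Serve the model with both control vectors applied at half strength;
# the --control-vector-scaled flags work exactly as in the llama-cli examples.
llama-server --model <model>.gguf \
    --control-vector-scaled <model>-dark.gguf 0.5 \
    --control-vector-scaled <model>-chaos.gguf 0.5 \
    [the rest of your server arguments...]
```

The scale factors can then be tuned per model without restarting your client, by relaunching the server with different values.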