---
base_model: []
library_name: peft
tags:
- mergekit
- peft
---
# Umbral-Mind-r128-LoRA

This is a LoRA extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from merge/f32-umbral and uses merge/f32-instruct as a base.

### Parameters

The following command was used to extract this LoRA adapter:

```sh
/usr/local/bin/mergekit-extract-lora --out-path=loras/Umbral-Mind-r128-LoRA --model=merge/f32-umbral --base-model=merge/f32-instruct --no-lazy-unpickle --max-rank=128 --sv-epsilon=0 --cuda --multi-gpu -v
```
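
### Usage

A minimal loading sketch with the `peft` library, assuming the adapter and a compatible instruct base model are available locally or on the Hugging Face Hub. The repo/path identifiers below are placeholders (they are not defined by this card) and should be replaced with the actual locations of the base model and this adapter.

```python
# Sketch: attach the extracted LoRA to a compatible base model with peft.
# Both identifiers below are placeholders, not paths published by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "path/to/instruct-base"            # placeholder for the merge/f32-instruct base
adapter_id = "path/to/Umbral-Mind-r128-LoRA" # placeholder for this adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")

# Load the LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Optionally fold the adapter back into the base weights for plain inference.
model = model.merge_and_unload()
```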