# beomi/KoAlpaca-RealQA-Solar-Ko-Recovery-11B-Q8_0-GGUF
This LoRA adapter was converted to GGUF format from beomi/KoAlpaca-RealQA-Solar-Ko-Recovery-11B
via ggml.ai's GGUF-my-lora space.
Refer to the original adapter repository for more details.
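
The `--lora` flag applies this adapter on top of a base model that must itself be in GGUF format. As a minimal sketch, assuming `huggingface-cli` is installed, the adapter file could be fetched like this (the base-model repository name is a placeholder, since this card does not name a GGUF conversion of the base model):

```bash
# Download the Q8_0 LoRA adapter from this repository
huggingface-cli download beomi/KoAlpaca-RealQA-Solar-Ko-Recovery-11B-Q8_0-GGUF \
  KoAlpaca-RealQA-Solar-Ko-Recovery-11B-q8_0.gguf --local-dir .

# The base model GGUF must be obtained separately, e.g. from a GGUF
# conversion of beomi/Solar-Ko-Recovery-11B (repository name hypothetical)
huggingface-cli download <base-model-gguf-repo> base_model.gguf --local-dir .
```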
## Use with llama.cpp
```bash
# with cli
llama-cli -m base_model.gguf --lora KoAlpaca-RealQA-Solar-Ko-Recovery-11B-q8_0.gguf (...other args)

# with server
llama-server -m base_model.gguf --lora KoAlpaca-RealQA-Solar-Ko-Recovery-11B-q8_0.gguf (...other args)
```
To learn more about LoRA usage with the llama.cpp server, refer to the llama.cpp server documentation.
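
As an illustration, once the server is running with the adapter applied, its `/completion` endpoint can be queried over HTTP. The port, prompt, and token count below are placeholders, not values from this card:

```bash
# Start the server with the LoRA adapter applied (port is an example)
llama-server -m base_model.gguf \
  --lora KoAlpaca-RealQA-Solar-Ko-Recovery-11B-q8_0.gguf --port 8080

# Query the /completion endpoint; prompt and n_predict are illustrative
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "안녕하세요, 자기소개를 해주세요.", "n_predict": 128}'
```

llama.cpp also accepts `--lora-scaled FNAME SCALE` in place of `--lora` if you want to blend the adapter in at a fractional strength.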
## Model tree for beomi/KoAlpaca-RealQA-Solar-Ko-Recovery-11B-Q8_0-GGUF

- Base model: upstage/SOLAR-10.7B-v1.0
- Finetuned from: beomi/Solar-Ko-Recovery-11B