## Usage
```python
from llama_cpp import Llama

# Download the GGUF weights from the Hugging Face Hub and load them.
llm = Llama.from_pretrained(
    repo_id="krishkpatil/legal_llm",
    filename="unsloth.Q4_K_M.gguf",  # Replace with the actual GGUF filename if different
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "Explain the concept of judicial review in India.",
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```
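The `response` object returned by `create_chat_completion` follows the OpenAI-style chat-completion schema, which is why the last line indexes into `choices[0]['message']['content']`. A minimal sketch of extracting the reply from that structure, using a hypothetical hand-built response dict (`extract_reply` and `sample` are illustration names, not part of the library):

```python
# The assistant's text sits at choices[0]["message"]["content"] in the
# OpenAI-style dict returned by create_chat_completion.
def extract_reply(response: dict) -> str:
    """Return the assistant's message text from a chat-completion response."""
    return response["choices"][0]["message"]["content"]


# Hypothetical, abridged response shaped like llama-cpp-python's output:
sample = {
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Judicial review empowers courts to examine the constitutionality of laws.",
            },
            "finish_reason": "stop",
        }
    ]
}

print(extract_reply(sample))
```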
## Uploaded model
- Developed by: krishkpatil
- License: apache-2.0
- Finetuned from model: unsloth/llama-3-8b-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
## Model tree for krishkpatil/legal_llm

- Base model: meta-llama/Meta-Llama-3-8B
- Quantized: unsloth/llama-3-8b-bnb-4bit