MongoDB Query Generator - Llama-3.2-1B (Fine-tuned)
- Developed by: skshmjn
- License: apache-2.0
- Finetuned from model: unsloth/Llama-3.2-1B-Instruct
- Dataset Used: skshmjn/mongodb-chat-query
- Supports: Transformers & GGUF (for fast inference on CPU/GPU)
🚀 Model Overview
This model generates MongoDB queries from natural-language prompts. It supports (see the illustrative sketch after this list):
- Basic CRUD operations: `find`, `insert`, `update`, `delete`
- Aggregation pipelines: `$group`, `$match`, `$lookup`, `$sort`, etc.
- Indexing & performance queries
- Nested queries & joins (`$lookup`)
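As a rough illustration of the query families listed above, the pymongo snippet below shows the kind of `find` filter and `$lookup`/`$group` pipeline the model is meant to produce. The database, collection, and field names are hypothetical, and a local MongoDB instance is assumed.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical local instance
db = client["company"]                             # hypothetical database name

# CRUD-style query: employees older than 30
older_than_30 = db.employees.find({"age": {"$gt": 30}})

# Aggregation pipeline: join employees to departments, count per department, sort
headcount_by_dept = db.employees.aggregate([
    {"$lookup": {"from": "departments", "localField": "dept_id",
                 "foreignField": "_id", "as": "department"}},
    {"$group": {"_id": "$dept_id", "headcount": {"$sum": 1}}},
    {"$sort": {"headcount": -1}},
])
```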
The model was fine-tuned with Unsloth for efficient training and quantized to GGUF for fast inference.
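For reference, fine-tuning a Llama-3.2-1B-Instruct base with Unsloth typically follows the pattern sketched below; the LoRA hyperparameters and sequence length are illustrative, not the exact settings used for this checkpoint.

```python
from unsloth import FastLanguageModel

# Load the base model in 4-bit for memory-efficient fine-tuning
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-1B-Instruct",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters (hyperparameters here are illustrative defaults)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# The skshmjn/mongodb-chat-query dataset is then formatted with the chat
# template and passed to trl's SFTTrainer (omitted here for brevity).
```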
📌 Example Usage (Transformers)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "skshmjn/Llama-3.2-1B-Mongo-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Natural-language request to translate into a MongoDB query
prompt = "Find all employees older than 30 in the 'employees' collection."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 100 new tokens and decode the result
output = model.generate(**inputs, max_new_tokens=100)
query = tokenizer.decode(output[0], skip_special_tokens=True)
print(query)
```
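📌 Example Usage (GGUF)
For CPU-friendly inference with a GGUF quantization, a minimal llama-cpp-python sketch is shown below; the GGUF filename is illustrative, so substitute whichever quantized file you download from the repo.

```python
from llama_cpp import Llama

# Load a downloaded GGUF quantization of this model (filename is illustrative)
llm = Llama(model_path="Llama-3.2-1B-Mongo-Instruct.Q4_K_M.gguf", n_ctx=2048)

prompt = "Find all employees older than 30 in the 'employees' collection."
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": prompt}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])
```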
Model tree for skshmjn/Llama-3.2-1B-Mongo-Instruct
- Base model: meta-llama/Llama-3.2-1B-Instruct
- Finetuned from: unsloth/Llama-3.2-1B-Instruct