---
base_model: unsloth/Qwen2.5-Coder-7B-Instruct-bnb-4bit
library_name: transformers
model_name: onekq-ai/OneSQL-v0.1-Qwen-7B
tags:
- generated_from_trainer
- unsloth
- trl
- sft
license: apache-2.0
pipeline_tag: text-generation
---

# Introduction

This model specializes in the Text-to-SQL task. It is fine-tuned from the quantized version of [Qwen2.5-Coder-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct).

Its sibling [32B model](https://huggingface.co/onekq-ai/OneSQL-v0.1-Qwen-32B) has an EX score of **63.33** and an R-VES score of **60.02** on the [BIRD leaderboard](https://bird-bench.github.io/). The self-evaluation EX score of this model is **56.19**.

# Quick start

To use this model, craft your prompt so that it starts with your database schema in the form of **CREATE TABLE** statements, followed by your natural language query preceded by **--**. Make sure your prompt ends with **SELECT** so that the model finishes the query for you.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from peft import PeftModel

model_name = "unsloth/Qwen2.5-Coder-7B-Instruct-bnb-4bit"
adapter_name = "onekq-ai/OneSQL-v0.1-Qwen-7B"

# Load the 4-bit base model and attach the OneSQL LoRA adapter.
# device_map="auto" already places the model on the GPU, so no extra .to("cuda") is needed.
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.padding_side = "left"
model = PeftModel.from_pretrained(
    AutoModelForCausalLM.from_pretrained(model_name, device_map="auto"),
    adapter_name,
)
generator = pipeline("text-generation", model=model, tokenizer=tokenizer, return_full_text=False)

# Schema first, then the natural language query after --, ending with SELECT.
prompt = """
CREATE TABLE students (
    id INTEGER PRIMARY KEY,
    name TEXT,
    age INTEGER,
    grade TEXT
);

-- Find the three youngest students
SELECT """

result = generator(f"<|im_start|>system\nYou are a SQL expert. Return code only.<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n")[0]
print(result["generated_text"])
```

The model response is the finished SQL query without the leading **SELECT**:

```sql
* FROM students ORDER BY age ASC LIMIT 3
```
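
Because the completion omits the leading **SELECT**, prepend it yourself before executing the query. Below is a minimal sketch that continues from `result` in the snippet above; the in-memory SQLite database and its sample rows are illustrative assumptions, not part of the model card.

```python
import sqlite3

# The model returns only the continuation, so prepend SELECT to obtain a runnable query.
sql = "SELECT " + result["generated_text"].strip()

# Illustrative check against an assumed in-memory SQLite database matching the schema above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, age INTEGER, grade TEXT)")
conn.executemany(
    "INSERT INTO students (name, age, grade) VALUES (?, ?, ?)",
    [("Ada", 17, "A"), ("Bo", 15, "B"), ("Cy", 16, "A")],
)
print(conn.execute(sql).fetchall())
```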