---
library_name: transformers
tags:
  - text-generation
  - text-generation-inference
  - Inference Endpoints
license: mit
datasets:
  - omeryentur/text-to-postgresql
language:
  - en
metrics:
  - rouge
pipeline_tag: text2text-generation
---

Model Card for sql-generator

Model Details

Model Description

This model is fine-tuned from google/flan-t5-base to achieve better accuracy at generating SQL queries. It has been trained to generate SQL queries given a question and one or more database schemas.

It can be used in applications where SQL queries need to be generated, particularly PostgreSQL queries.
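For illustration, an input pairs a database schema with a natural-language question, and the model returns a query answering it. The prompt wording and schema below are assumptions made for the sake of the example; the exact prompt template used during fine-tuning may differ.

```python
# Illustrative input/output pair; the prompt template and schema are hypothetical.
prompt = (
    "Given the following schema, write a SQL query to answer the question.\n\n"
    "Schema:\n"
    "CREATE TABLE employees (id INTEGER, name TEXT, department TEXT, salary NUMERIC);\n\n"
    "Question: What is the average salary per department?"
)

# The kind of query the model is expected to produce:
# SELECT department, AVG(salary) FROM employees GROUP BY department;
```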

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: Oghenerunor Adjekpiyede
  • Model type: Text2TextGeneration
  • Language(s) (NLP): English
  • License: MIT
  • Finetuned from model: google/flan-t5-base


Uses

This model is intended for generating SQL queries and performs well at that task. It may not give satisfactory performance on other general text-generation use cases.

Direct Use

Use with transformers

```python
import torch
from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the base model and tokenizer, then apply the PEFT adapter from this repository.
peft_model_path = "kampkelly/sql-generator"  # Hub ID of this adapter (assumed from the repository name) or a local path

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model_base = AutoModelForSeq2SeqLM.from_pretrained(
    "google/flan-t5-base", torch_dtype=torch.bfloat16, trust_remote_code=True
)
model = PeftModel.from_pretrained(
    model_base,
    peft_model_path,
    torch_dtype=torch.bfloat16,
    is_trainable=False,
)

# `prompt` should contain the question and the relevant database schema(s); see the example above.
input_ids = tokenizer(
    prompt, padding="max_length", max_length=300, truncation=True, return_tensors="pt"
).input_ids
model_output = model.generate(
    input_ids=input_ids, max_new_tokens=300, use_cache=True,
    num_beams=3, do_sample=True, top_k=50, top_p=0.75,
    temperature=0.1, early_stopping=True,
)
model_text_output = tokenizer.decode(model_output[0], skip_special_tokens=True)
print(model_text_output)
```
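For repeated use, the snippet above can be wrapped in a small helper. This is a sketch rather than part of the published model: the prompt template is an assumption, and the generation parameters simply reuse those shown above.

```python
# Hypothetical helper around the generation code above (uses the tokenizer and model loaded earlier).
def generate_sql(question: str, schema: str) -> str:
    # Assumed prompt format: schema followed by the question.
    prompt = (
        "Given the following schema, write a SQL query to answer the question.\n\n"
        f"Schema:\n{schema}\n\n"
        f"Question: {question}"
    )
    input_ids = tokenizer(
        prompt, padding="max_length", max_length=300, truncation=True, return_tensors="pt"
    ).input_ids
    output = model.generate(
        input_ids=input_ids, max_new_tokens=300, use_cache=True,
        num_beams=3, do_sample=True, top_k=50, top_p=0.75,
        temperature=0.1, early_stopping=True,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

query = generate_sql(
    "What is the average salary per department?",
    "CREATE TABLE employees (id INTEGER, name TEXT, department TEXT, salary NUMERIC);",
)
```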

Bias, Risks, and Limitations

This model is particularly good at generating SQL SELECT statements. Other statement types, such as CREATE, DELETE, and UPDATE, are not fully supported.
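If generated queries are executed automatically, it may be worth rejecting anything that is not a read-only SELECT (or WITH ... SELECT) statement. A minimal sketch of such a guard, applied to the decoded output from the Direct Use snippet, is shown below; it is not part of the model.

```python
# Minimal guard for the SELECT-only limitation; not part of the model itself.
def is_select_query(query: str) -> bool:
    stripped = query.strip()
    first_keyword = stripped.split(None, 1)[0].upper() if stripped else ""
    # Allow plain SELECTs and CTEs that begin with WITH.
    return first_keyword in ("SELECT", "WITH")

if not is_select_query(model_text_output):
    raise ValueError("Generated statement is not a SELECT query; refusing to execute it.")
```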