---
library_name: transformers
tags: []
---

# Model Card for code-gemma-7b Fine-Tuned for Text-to-SQL

## Model Details

### Model Description

This model is a fine-tuned version of Google's code-gemma-7b, trained on a text-to-SQL task using 1,000 samples from the lamini/bird_text_to_sql dataset.

#### Training Hyperparameters

- epochs = 1
- optimizer = paged_adamw_8bit
- learning_rate = 0.0002
- warmup_ratio = 0.05
- lr_scheduler_type = linear
- weight_decay = 0.01
- max_seq_length = 512
- lora_rank = 128
- lora_alpha = 32
- lora_dropout = 0.1
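
The hyperparameters above map naturally onto a PEFT LoRA configuration plus a TRL `SFTTrainer` run. The sketch below is a plausible reconstruction under assumptions, not the original training script: the base-model Hub ID, dataset split and text column, batch-size settings, and output path are all guesses; only the values listed in the hyperparameter table are taken from this card.

```python
# Hypothetical reconstruction of the fine-tuning setup described above.
# Only the hyperparameters listed in the card are taken from it; everything
# else (Hub IDs, dataset column, batch sizes) is an assumption.
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

base_model = "google/codegemma-7b"  # assumed Hub ID for the base model

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")

# 1,000-sample subset of the text-to-SQL dataset named in the card
# (split name is an assumption).
dataset = load_dataset("lamini/bird_text_to_sql", split="train").select(range(1000))

# LoRA settings from the hyperparameter list above.
peft_config = LoraConfig(
    r=128,
    lora_alpha=32,
    lora_dropout=0.1,
    bias="none",
    task_type="CAUSAL_LM",
)

# Optimizer and schedule settings from the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="codegemma-7b-text-to-sql",  # placeholder path
    num_train_epochs=1,
    optim="paged_adamw_8bit",
    learning_rate=2e-4,
    warmup_ratio=0.05,
    lr_scheduler_type="linear",
    weight_decay=0.01,
    per_device_train_batch_size=1,   # not stated in the card
    gradient_accumulation_steps=4,   # not stated in the card
    logging_steps=10,
)

# Signature follows the trl 0.7.x-style SFTTrainer; newer trl releases move
# max_seq_length and dataset_text_field into SFTConfig.
trainer = SFTTrainer(
    model=model,
    args=training_args,
    train_dataset=dataset,
    peft_config=peft_config,
    tokenizer=tokenizer,
    dataset_text_field="text",  # assumed column name
    max_seq_length=512,
)
trainer.train()
```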
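
For inference, the checkpoint can be loaded with the standard `transformers` generation API. The snippet below is only an illustration: the repository ID is a placeholder, and the prompt template used during fine-tuning is not documented in this card.

```python
# Hypothetical usage example. The Hub repository ID is a placeholder;
# replace it with the actual repo for this fine-tuned checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/codegemma-7b-text-to-sql"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The exact prompt format used during fine-tuning is not documented here;
# this schema-plus-question layout is only an illustration.
prompt = (
    "-- Schema: CREATE TABLE employees (id INT, name TEXT, salary INT);\n"
    "-- Question: List the names of employees earning more than 50000.\n"
    "-- SQL:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the checkpoint is published as a LoRA adapter rather than merged weights, load it with PEFT's `AutoPeftModelForCausalLM` instead of `AutoModelForCausalLM`.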