---
tags:
  - javascript
  - code-generation
  - transformers
  - fine-tuned
  - distilgpt2
license: mit
library_name: transformers
---

# 🚀 DistilGPT-2 Code Generator (Explanation → JavaScript Code)

This model is a fine-tuned version of [`distilgpt2`](https://huggingface.co/distilgpt2), trained to generate JavaScript code from natural-language explanations.

It was trained on a dataset of explanation–code pairs, making it useful for:

- ✅ Code generation from text descriptions
- ✅ Learning JavaScript syntax & patterns
- ✅ Automated coding assistance


## 🛠 Model Details

- **Base Model:** `distilgpt2`, a distilled version of GPT-2 (~82M parameters, smaller and roughly twice as fast)
- **Dataset:** JavaScript explanations + corresponding functions
- **Fine-tuning:** LoRA (memory-efficient low-rank adaptation; see the sketch below)
- **Training Environment:** Google Colab (T4 GPU)
- **Optimization:** FP16 mixed precision for faster training
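
The exact training script is not included in this repo, but a comparable LoRA + FP16 setup with `peft` and `transformers` could look like the sketch below. The hyperparameters, adapter settings, and the toy dataset are illustrative assumptions, not the exact recipe used for this model.

```python
# Illustrative sketch only: hyperparameters, adapter settings, and the
# toy dataset are assumptions, not the exact recipe used for this model.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "distilgpt2"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA trains small low-rank adapter matrices instead of all weights,
# which keeps memory usage low enough for a free Colab T4.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused QKV projection layer
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Toy stand-in for the real explanation–code dataset (assumption),
# formatted the same way the inference prompt is built below.
pairs = Dataset.from_dict({
    "text": [
        "### Explanation:\nReturn a greeting for a given name.\n\n"
        "### Generate JavaScript code:\n"
        "function greet(name) { return `Hello, ${name}!`; }",
    ]
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized_dataset = pairs.map(tokenize, batched=True, remove_columns=["text"])

training_args = TrainingArguments(
    output_dir="distilgpt2-code-generator",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    fp16=True,  # mixed-precision training, as noted above
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```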

## 📊 Example Usage

Load the model and generate JavaScript code from explanations:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sureal01/distilgpt2-code-generator"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

def generate_code(explanation):
    input_text = f"### Explanation:\n{explanation}\n\n### Generate JavaScript code:\n"
    inputs = tokenizer(input_text, return_tensors="pt")

    # do_sample=True is required for temperature/top_p to take effect;
    # pad_token_id silences the "no pad token" warning for GPT-2 models.
    output = model.generate(
        **inputs,
        max_new_tokens=150,
        do_sample=True,
        temperature=0.5,
        top_p=0.9,
        repetition_penalty=1.5,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example
test_explanation = "This function takes a name as input and returns a greeting message."
generated_code = generate_code(test_explanation)
print("\n🔹 **Generated Code:**\n", generated_code)
```
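
The decoded output includes the prompt itself. To keep only the generated JavaScript, one simple option (a post-processing convention, not part of the model's API) is to split on the prompt marker:

```python
# Keep only the text that follows the prompt marker.
marker = "### Generate JavaScript code:\n"
code_only = generated_code.split(marker, 1)[-1]
print(code_only)
```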