---
language: en
license: mit
tags:
  - code
  - python
  - assistant
  - causal-lm
  - streamlit
pipeline_tag: text-generation
---

# 🧠 Python Code Assistant (Fine-tuned CodeGen 350M)

This model is a fine-tuned version of `Salesforce/codegen-350M-multi` that generates Python code from natural-language prompts.

## 🧪 Example Prompt

```
Write a Python function to check if a number is prime.
```

## ✅ Example Output

```python
def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True
```
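
The output above is valid Python; restated as a standalone snippet with a quick sanity check:

```python
def is_prime(n):
    # a number below 2 is not prime by definition
    if n < 2:
        return False
    # only trial-divide up to the square root of n
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True

# primes below 20
print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]
```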

## 🛠️ Intended Use

- Educational coding help
- Rapid prototyping in notebooks or IDEs
- Integration with Streamlit apps

> 🚫 Not intended to replace formal code review or secure programming practices.

## 🔍 Model Details

- Base: `Salesforce/codegen-350M-multi`
- Training: Fine-tuned on 500+ Python instruction-completion pairs
- Format: causal LM

## 🧰 How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("AhsanFarabi/python-assistant")
tokenizer = AutoTokenizer.from_pretrained("AhsanFarabi/python-assistant")

prompt = "Write a function to reverse a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

---