A Transformer-based code fixing model trained on diverse buggy → fixed code pairs.

- **Languages**: Python (initially); can be expanded to JS, Go, etc.
## 🧠 Intended Use

Input a buggy function or script and receive a syntactically and semantically corrected version.

**Example**:
```python
# Input:
def add(x, y)
    return x + y

# Output:
def add(x, y):
    return x + y
```
## 🧠 How it Works

The model learns from training examples that map erroneous code to corrected code. It uses token-level sequence generation to predict patches.
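The token-level view can be illustrated with a minimal sketch using only the standard library's `difflib` (whitespace-split tokens stand in for the model's real subword tokenizer): diffing the buggy and fixed token sequences exposes exactly which tokens a predicted patch must change.

```python
import difflib

# Buggy → fixed pair (the same example as the dataset format section).
buggy = "def add(x, y)\n return x + y"
fixed = "def add(x, y):\n return x + y"

# Crude whitespace tokenization; a real model uses a subword tokenizer.
diff = list(difflib.ndiff(buggy.split(), fixed.split()))

# Keep only tokens the patch removes ("-") or inserts ("+");
# "? " hint lines and unchanged tokens are dropped.
changes = [d for d in diff if d[0] in "+-"]
print(changes)  # the patch rewrites the token "y)" as "y):"
```

Everything else in the pair is copied through unchanged; the model only has to generate the edited tokens in context.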
## 🚀 Inference

Use the `transformers` pipeline, or run via the CLI:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("YOUR_USERNAME/aifixcode-model")
tokenizer = AutoTokenizer.from_pretrained("YOUR_USERNAME/aifixcode-model")

# A buggy snippet: the closing parenthesis is missing.
input_code = "def foo(x):\n print(x"
inputs = tokenizer(input_code, return_tensors="pt")
out = model.generate(**inputs, max_length=512)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```
## 📊 Dataset Format

```json
[
  {
    "input": "def add(x, y)\n return x + y",
    "output": "def add(x, y):\n return x + y"
  }
]
```
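This format can be consumed with the standard library alone. A minimal sketch of turning it into (buggy, fixed) training pairs — the JSON is inlined here in place of an actual dataset file:

```python
import json

# The dataset format from above, inlined in place of a file on disk.
raw = '''
[
  {
    "input": "def add(x, y)\\n return x + y",
    "output": "def add(x, y):\\n return x + y"
  }
]
'''

records = json.loads(raw)

# Each record becomes one (buggy, fixed) training pair.
pairs = [(r["input"], r["output"]) for r in records]
print(len(pairs))         # number of training examples
print(repr(pairs[0][1]))  # the corrected code, newline decoded
```

Note that `\n` inside the JSON strings is decoded to a real newline by `json.loads`, so each pair holds multi-line source code.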
## 🛡️ License

MIT License
## 🙏 Acknowledgements

Built using 🤗 HuggingFace Transformers + Salesforce CodeT5.