JiyangZhang committed · Commit 3d23018 · verified · Parent(s): f8beda1

Update README.md

Files changed (1): README.md (+78 −3)
README.md CHANGED
---
license: mit
---

# exLong

exLong is a large language model instruction-tuned from CodeLlama. It embeds reasoning about the traces that lead to throw statements, the conditional expressions that guard those throw statements, and the non-exceptional behavior tests that execute similar traces.

The model is fine-tuned from CodeLlama-7b-Instruct using LoRA.

| Size | Base Model | EBT name provided in the prompt | EBT name not provided in the prompt |
| --- | --- | --- | --- |
| 7B | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) | `revision="with-etest-name"` | `revision="no-etest-name"` |

## Model Use

```bash
pip install transformers accelerate bitsandbytes peft
```

````python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model
base_model_name = "codellama/CodeLlama-7b-Instruct-hf"
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)

# Load the LoRA adapter; set revision="no-etest-name" for the variant
# that does not expect the EBT name in the prompt
peft_model_id = "EngineeringSoftware/exLong"
model = PeftModel.from_pretrained(base_model, peft_model_id, revision="with-etest-name")
tokenizer = AutoTokenizer.from_pretrained(base_model_name)

prompt = """<s>[INST] <<SYS>>
You are a helpful programming assistant and an expert Java programmer. You are helping a user writing exceptional-behavior tests for their Java code.
<</SYS>>

Please complete an exceptional behavior test method in Java to test the method 'factorial' for the exception 'IllegalArgumentException'.
The method to be tested is defined as:
```java
public static long factorial(int n) {
    if (n < 0) {
        throw new IllegalArgumentException("Number must be non-negative.");
    }
    long result = 1;
    for (int i = 1; i <= n; i++) {
        result *= i;
    }
    return result;
}
```
Please only give the new exceptional-behavior test method to complete the following test class. Do NOT use extra libraries or define new helper methods. Return **only** the code in the completion:
```java
public class FactorialTest {
}
```
"""

input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Generate code
output = model.generate(
    input_ids=input_ids,
    max_new_tokens=100,
    temperature=0.2,  # Sampling temperature (lower is more deterministic)
    top_p=0.95,       # Top-p (nucleus) sampling
    do_sample=True,   # Enable sampling
)

# Decode and print the generated code
generated_code = tokenizer.decode(output[0], skip_special_tokens=True)
print("Generated Code:")
print(generated_code)
````
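
Note that `output[0]` contains the prompt tokens followed by the completion, so decoding the whole sequence repeats the prompt before the generated test. A minimal trimming sketch (the `extract_completion` helper and the cut at a `[/INST]` marker are our assumptions, not part of exLong; decoding only `output[0][input_ids.shape[1]:]` achieves the same at the token level):

```python
def extract_completion(decoded: str) -> str:
    """Keep only the text generated after the instruction block.

    Hypothetical post-processing helper: it assumes the decoded
    sequence still contains the prompt and cuts at the last
    '[/INST]' marker if one is present; otherwise it returns the
    decoded text unchanged.
    """
    marker = "[/INST]"
    idx = decoded.rfind(marker)
    return decoded[idx + len(marker):].strip() if idx != -1 else decoded.strip()

# Illustration on a made-up decoded string:
decoded = "[INST] ...prompt... [/INST] public class FactorialTest { /* test */ }"
print(extract_completion(decoded))
```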