This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1), trained on the b-mc2/sql-create-context dataset.
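Each record in b-mc2/sql-create-context pairs a natural-language question with a SQL answer. As a minimal sketch of how one record can be formatted into the Instruct/Output prompt style used in the testing examples below (the field names `question`, `context`, and `answer`, and the helper `build_prompt`, are assumptions for illustration, not part of the card's training code):

```python
# Sketch only: field names assumed from the b-mc2/sql-create-context schema.
def build_prompt(sample: dict) -> str:
    """Format one dataset record into the Instruct/Output prompt style."""
    return (
        "Instruct: generate a SQL query.\n"
        f"{sample['question']}\n"
        "Output:\n"
    )

sample = {
    "question": "What was the first album Beyoncé released as a solo artist?",
    "context": "CREATE TABLE table_name_82 (first_album VARCHAR, solo_artist VARCHAR)",
    "answer": 'SELECT first_album FROM table_name_82 WHERE solo_artist = "beyoncé"',
}
print(build_prompt(sample))
```

The `context` field (a `CREATE TABLE` statement) could also be prepended to the prompt to ground the generated SQL in a concrete schema.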
## Model description

More information needed
### Testing results

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer, pipeline

peft_model_id = "frankmorales2020/Mistral-7B-text-to-sql-without-flash-attention-2"

# Load the base model with the PEFT adapter
model = AutoPeftModelForCausalLM.from_pretrained(
    peft_model_id,
    device_map="auto",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(peft_model_id)

# Load model and tokenizer into a text-generation pipeline
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Case 1: instruct the model to generate a SQL query
prompt = 'What was the first album Beyoncé released as a solo artist?'
prompt = f"Instruct: generate a SQL query.\n{prompt}\nOutput:\n"  # prompt format for the b-mc2/sql-create-context dataset
outputs = pipe(
    prompt,
    max_new_tokens=1024,
    do_sample=True,
    temperature=0.9,
    top_k=50,
    top_p=0.1,
    eos_token_id=pipe.tokenizer.eos_token_id,
    pad_token_id=pipe.tokenizer.eos_token_id,
)

print(f"Question: {prompt}")
print(f"Generated Answer:\n{outputs[0]['generated_text'][len(prompt):].strip()}")
```

```
Question: Instruct: generate a SQL query.
What was the first album Beyoncé released as a solo artist?
Output:

Generated Answer:
SELECT first_album FROM table_name_82 WHERE solo_artist = "beyoncé"
```

```python
# Case 2: ask the model to answer the same question in natural language
prompt = 'What was the first album Beyoncé released as a solo artist?'
prompt = f"Instruct: Answer the following question.\n{prompt}\nOutput:\n"
outputs = pipe(
    prompt,
    max_new_tokens=1024,
    do_sample=True,
    temperature=0.9,
    top_k=50,
    top_p=0.1,
    eos_token_id=pipe.tokenizer.eos_token_id,
    pad_token_id=pipe.tokenizer.eos_token_id,
)

print(f"Question: {prompt}")
print(f"Generated Answer:\n{outputs[0]['generated_text'][len(prompt):].strip()}")
```

```
Question: Instruct: Answer the following question.
What was the first album Beyoncé released as a solo artist?
Output:

Generated Answer:
The first album Beyoncé released as a solo artist was "Dangerously in Love".
```
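The prompt-stripping step used in the testing snippets above (`outputs[0]['generated_text'][len(prompt):].strip()`) relies on the text-generation pipeline echoing the prompt at the start of `generated_text`. It can be factored into a small helper; `extract_answer` and the mocked output below are illustrative sketches, not part of the original card:

```python
def extract_answer(outputs: list, prompt: str) -> str:
    """Strip the echoed prompt from a text-generation pipeline result."""
    generated = outputs[0]["generated_text"]
    return generated[len(prompt):].strip()

# The pipeline returns a list of dicts; mocked here to show the shape.
fake_outputs = [{"generated_text": "Q?\nOutput:\nSELECT 1"}]
print(extract_answer(fake_outputs, "Q?\nOutput:\n"))  # SELECT 1
```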
## Intended uses & limitations

More information needed