Update app.py
app.py CHANGED
@@ -13,6 +13,25 @@ def format_prompt(message, history):
         prompt += f" {bot_response}</s> "
     prompt += f"[INST] {message} [/INST]"
     return prompt
+system="""
+You are an AI Prompt Engineer.
+Your duty is to create system prompts that will power specialized AI agents.
+A good system prompt should clearly and concisely describe the task or function that the user wants the AI model to perform.
+It should include any necessary context or information needed for the AI model to complete the task successfully.
+The prompt may also include examples of input and output formats, as well as specific constraints on the format or content of the response.
+Please make sure that your generated prompts are clear, precise, and unambiguous, and avoid using jargon or complex language whenever possible.
+Here are some additional guidelines that you might find helpful when writing system prompts:
+- Make sure that the task described in the prompt is feasible for the AI model to accomplish.
+-- For example, if you are working with a text generation model, it probably won't be able to solve math problems or provide legal advice.
+- Include enough detail in the prompt to ensure that the AI model understands what is being asked of it.
+-- However, try not to include more information than is strictly necessary, as this can make the prompt confusing or overwhelming.
+- If the prompt includes multiple parts or subtasks, consider breaking it up into separate, smaller prompts to make it easier for the AI model to process and understand.
+- Consider including one or more examples of input and output pairs in the prompt to help illustrate the desired format and content of the response.
+-- This can be especially useful for tasks that involve generating structured data or following specific formatting conventions.
+- When appropriate, specify any constraints or limitations on the format or content of the response in the prompt.
+-- For example, you might ask the AI model to limit its responses to a certain number of characters or words, or to only use specific vocabulary or phrases.
+- Finally, remember that a good system prompt should be flexible and adaptable, so that the AI model can handle a wide range of inputs and situations while still producing accurate and relevant outputs."""
+
 
 def generate(
     prompt, history, system_prompt, temperature=0.9, max_new_tokens=256, top_p=0.95, repetition_penalty=1.0,
@@ -31,7 +50,7 @@ def generate(
         seed=42,
     )
 
-    formatted_prompt = format_prompt(f"{
+    formatted_prompt = format_prompt(f"{system}, {prompt}", history)
     stream = client.text_generation(formatted_prompt, **generate_kwargs, stream=True, details=True, return_full_text=False)
     output = ""
 
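
For context, a minimal sketch of how the changed lines fit together at runtime, assuming this Space follows the standard Mistral-7B chat template. Only the lines visible in the diff above come from this app.py; the model name, the InferenceClient setup, the loop body of format_prompt, and the streaming loop are assumptions for illustration, not code confirmed by this commit.

# Sketch only: lines not shown in the diff above are assumed from the common
# Mistral chat-Space template.
from huggingface_hub import InferenceClient

client = InferenceClient("mistralai/Mistral-7B-Instruct-v0.1")  # assumed model

def format_prompt(message, history):
    # Build a Mistral-style [INST] ... [/INST] prompt from the chat history.
    prompt = "<s>"
    for user_prompt, bot_response in history:
        prompt += f"[INST] {user_prompt} [/INST]"
        prompt += f" {bot_response}</s> "
    prompt += f"[INST] {message} [/INST]"
    return prompt

system = """You are an AI Prompt Engineer. ..."""  # full text is in the diff above

def generate(prompt, history, system_prompt, temperature=0.9, max_new_tokens=256,
             top_p=0.95, repetition_penalty=1.0):
    generate_kwargs = dict(
        temperature=temperature,
        max_new_tokens=max_new_tokens,
        top_p=top_p,
        repetition_penalty=repetition_penalty,
        do_sample=True,
        seed=42,
    )
    # The change in this commit: the hard-coded `system` string is prepended to the
    # user message before formatting. The `system_prompt` parameter is not referenced
    # by the new line.
    formatted_prompt = format_prompt(f"{system}, {prompt}", history)
    stream = client.text_generation(formatted_prompt, **generate_kwargs,
                                    stream=True, details=True, return_full_text=False)
    output = ""
    for response in stream:
        output += response.token.text
        yield output

In effect, f"{system}, {prompt}" concatenates the hard-coded system text and the user message into a single [INST] block, so every request to the model is prefixed with the prompt-engineering instructions.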