Kelechi Osuji committed on
Commit e861aef · 1 Parent(s): a7cd53f

Define the prompt template for both general conversation and weather retrieval

Files changed (1)
workflow.py +20 -4
workflow.py CHANGED
@@ -11,12 +11,28 @@ def get_session_history(memory: BaseMemory):
 def get_workflow():
     """Set up the chatbot workflow with memory and prompt template."""
 
-    # Define the prompt template for conversation
+    # Define the prompt template for both general conversation and weather retrieval
     prompt_template = PromptTemplate(
-        input_variables=["input"],
-        template="You are a helpful assistant. The user just said: {input}. What would be a good response?"
-    )
+        input_variables=["input", "previous_conversation"],
+        template="""
+        You are a helpful assistant. You should answer the user's question or have a normal conversation. If the user asks about the weather,
+        please respond with the current weather information based on their input location. Otherwise, answer to the best of your ability.
+
+        If the user's input is about the weather, you should respond with details about the weather.
+        For example:
+        - "What is the weather in Paris?"
+        - "How's the weather in New York?"
+
+        Example conversation flow:
+        User: What's the weather like today in London?
+        Assistant: Let me check the weather for you. The current weather in London is [weather details].
+
+        If the input is not weather-related, just respond with a conversational response.
 
+        The user said: {input}
+        Previous conversation: {previous_conversation}
+        """)
+
     # Create memory for conversation
     memory = ConversationBufferMemory(memory_key="input", return_messages=True)
 
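
For reference, a minimal usage sketch (not part of this commit) of how the updated prompt_template might be rendered and sent to a chat model. The ChatOpenAI model choice and the empty previous_conversation value are assumptions, since the chain wiring that consumes this template is outside this hunk.

# Hypothetical usage sketch, assuming the prompt_template defined above is in scope.
from langchain_openai import ChatOpenAI  # assumed provider; any LangChain chat model works similarly

llm = ChatOpenAI(model="gpt-4o-mini")  # assumed model choice

# previous_conversation would normally be filled from ConversationBufferMemory;
# an empty string stands in for the first turn of a session here.
rendered = prompt_template.format(
    input="What's the weather like today in London?",
    previous_conversation="",
)
print(llm.invoke(rendered).content)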