#from transformers import AutoModelForCausalLM, AutoTokenizer
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.chains.conversation.memory import ConversationBufferWindowMemory
from langchain.prompts import PromptTemplate
import gradio as gr
import os  # used to read the OpenAI API key from the environment
#REPO_ID = "Xenova/gpt-3.5-turbo"
# Load the model and tokenizer from Hugging Face's model hub
#model = AutoModelForCausalLM.from_pretrained(REPO_ID)
#tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
#llm = ChatOpenAI(model=model, tokenizer=tokenizer)
# Read the OpenAI API key from the environment instead of hard-coding it in the source.
llm = ChatOpenAI(model_name="gpt-3.5-turbo", openai_api_key=os.environ.get("OPENAI_API_KEY"))
# Initialize global variable for conversation memory
buffer_memory = ConversationBufferWindowMemory(k=3)
conversation = ConversationChain(
    llm=llm,
    memory=buffer_memory,
    verbose=True
)
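# With ConversationBufferWindowMemory(k=3), the chain prepends the last three
# user/assistant exchanges to each new prompt, so follow-up questions keep their context.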
context = """
You act as a chatbot that answers users' questions about traditional Chinese medicine (TCM). \
Welcome the user in a friendly way. \
You respond in a short, very friendly style. \
For each text, mark NER tags. \
Tag categories: location, product. \
Text: I want to visit a clinic in [Ang Mo Kio](location). \
If a user says he/she wants to visit a traditional Chinese medicine (TCM) clinic, \
ask for which location he/she wants to visit. \
After the user replies with a location, show in the chat window the Google Map from this link \
https://www.google.com/maps/search/tcm+clinics+at+location+name \
For example, if a user wants to visit a clinic in Ang Mo Kio, \
show in the chat window the Google Map from this link \
https://www.google.com/maps/search/tcm+clinics+at+ang+mo+kio \
Examples of location names:
North:
Sembawang
Woodlands
Yishun
North-East:
Ang Mo Kio
Hougang
Punggol
Sengkang
Serangoon
East:
Bedok
Pasir Ris
Tampines
West:
Bukit Batok
Bukit Panjang
Choa Chu Kang
Clementi
Jurong East
Jurong West
Tengah
Central:
Bishan
Bukit Merah
Bukit Timah
Central Area
Geylang
Kallang
Whampoa
Marine Parade
Queenstown
Toa Payoh
For each text, mark NER tags. \
Tag categories: location, product. \
Text: I want to buy/get [Po Chai Pills](product). \
If a user wants to buy/get a product, suggest that \
he/she can consider buying/getting from https://www.amazon.sg/s?k=product+name \
For example, if a user wants to buy Po Chai Pills, suggest \
he/she can consider buying/getting from https://www.amazon.sg/s?k=po+chai+pills \
Examples of product names:
Ointment/Hong You/Feng You/Fengyou
Liquorice/Gan cao/Gancao
Chrysanthemum/Ju hua/Juhua
Goji berry/wolfberry/Gou Qi Zi/Gouqizi
Red dates/Jujubes/Hong Zao/Hongzao
"""
prompt_template = PromptTemplate.from_template('''system role: {context} \
user: {query} \
assistant:
''')
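# Illustrative rendering of the template (hypothetical query), e.g.
#   prompt_template.format(context=context, query="Where can I buy Po Chai Pills?")
# produces one flat string: "system role: <TCM instructions above> user: Where can I buy Po Chai Pills? assistant:"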
# Define Gradio Interface
# iface = gr.Interface(
# fn=lambda query: conversation.run(prompt_template.format(context=context, query=query)),
# inputs=gr.Textbox(),
# outputs=gr.Textbox(),
# live=True,
# )
# Create a function to handle the Gradio Interface
def chat_interface(query):
    # ConversationChain.run returns the reply text; the chain's window memory keeps
    # the recent chat history, so it does not need to be threaded through here.
    response = conversation.run(prompt_template.format(context=context, query=query))
    return response
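# Example call (hypothetical query; needs OPENAI_API_KEY set in the environment):
#   chat_interface("I want to visit a TCM clinic in Ang Mo Kio")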
# Create the Gradio Interface
iface = gr.Interface(
    fn=chat_interface,
    inputs=gr.Textbox(),
    outputs=gr.Textbox(),
)
# Launch Gradio Interface
iface.launch()
# gr.load("models/ksh-nyp/llama-2-7b-chat-TCMKB").launch()
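# A stateful chat-window alternative (a sketch, assuming a Gradio version that ships
# gr.ChatInterface, roughly 3.35+): the callback receives the new message plus the UI
# chat history and returns the reply string, while the LangChain window memory above
# still handles the model-side context.
# def chat_fn(message, history):
#     return conversation.run(prompt_template.format(context=context, query=message))
# gr.ChatInterface(fn=chat_fn).launch()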