Steve Huguenin-Elie committed on
Commit 2805de5 · 1 Parent(s): e880564

Customize chat for the multi-domain customer chat bot

Files changed (1)
  1. app.py +8 -10
app.py CHANGED
@@ -14,19 +14,17 @@ MAX_INPUT_TOKEN_LENGTH = int(os.getenv("MAX_INPUT_TOKEN_LENGTH", "4096"))
 DESCRIPTION = """\
 # Llama-2 7B Chat
 
-This Space demonstrates model [Llama-2-7b-chat](https://huggingface.co/meta-llama/Llama-2-7b-chat) by Meta, a Llama 2 model with 7B parameters fine-tuned for chat instructions. Feel free to play with it, or duplicate to run generations without a queue! If you want to run your own service, you can also [deploy the model on Inference Endpoints](https://huggingface.co/inference-endpoints).
+This Space demonstrates model [llama-2-7b-bics-multi_woz_v22](https://huggingface.co/stevugnin/llama-2-7b-bics-multi_woz_v22) by University of Luxembourg FSTM, a Llama 2 model with 7B parameters fine-tuned for multi-domain customer support chat instructions. Feel free to play with it, or duplicate to run generations without a queue! If you want to run your own service, you can also [deploy the model on Inference Endpoints](https://huggingface.co/inference-endpoints).
 
 🔎 For more details about the Llama 2 family of models and how to use them with `transformers`, take a look [at our blog post](https://huggingface.co/blog/llama2).
-
-🔨 Looking for an even more powerful model? Check out the [13B version](https://huggingface.co/spaces/huggingface-projects/llama-2-13b-chat) or the large [70B model demo](https://huggingface.co/spaces/ysharma/Explore_llamav2_with_TGI).
 """
 
 LICENSE = """
 <p/>
 
 ---
-As a derivate work of [Llama-2-7b-chat](https://huggingface.co/meta-llama/Llama-2-7b-chat) by Meta,
-this demo is governed by the original [license](https://huggingface.co/spaces/huggingface-projects/llama-2-7b-chat/blob/main/LICENSE.txt) and [acceptable use policy](https://huggingface.co/spaces/huggingface-projects/llama-2-7b-chat/blob/main/USE_POLICY.md).
+As a derivate work of [llama-2-7b-bics-multi_woz_v22](https://huggingface.co/stevugnin/llama-2-7b-bics-multi_woz_v22) by University of Luxembourg FSTM,
+this demo is governed by the original [license](https://huggingface.co/spaces/stevugnin/multi-domain-customer-support-chat/blob/main/LICENSE.txt) and [acceptable use policy](https://huggingface.co/spaces/stevugnin/multi-domain-customer-support-chat/blob/main/USE_POLICY.md).
 """
 
 if not torch.cuda.is_available():
@@ -127,11 +125,11 @@ chat_interface = gr.ChatInterface(
     ],
     stop_btn=None,
     examples=[
-        ["Hello there! How are you doing?"],
-        ["Can you explain briefly to me what is the Python programming language?"],
-        ["Explain the plot of Cinderella in a sentence."],
-        ["How many hours does it take a man to eat a Helicopter?"],
-        ["Write a 100-word article on 'Benefits of Open-Source in AI research'"],
+        ["Hi there! Can you give me some info on Cityroomz?"],
+        ["I am looking for a restaurant. I would like something cheap that has Chinese food."],
+        ["Please find me a train from Cambridge to Stansted airport."],
+        ["I'd like a train from Leicester to Cambridge, please!"],
+        ["Hi, I'm traveling to Cambridge soon and am looking forward to seeing some local tourist attractions."],
     ],
 )
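For context, the changed pieces assemble roughly as sketched below. Only the `DESCRIPTION` text and the MultiWOZ-style example prompts come from this diff; the surrounding `gr.ChatInterface` wiring is paraphrased from the Space's app.py and shown only in comments, so treat it as an assumption rather than the exact call.

```python
# Sketch of the customized pieces from app.py. The DESCRIPTION string and the
# examples list are taken from the diff; the gr.ChatInterface call at the
# bottom is a paraphrase (assumption), shown as a comment.

DESCRIPTION = """\
# Llama-2 7B Chat

This Space demonstrates model llama-2-7b-bics-multi_woz_v22 by University of
Luxembourg FSTM, a Llama 2 model with 7B parameters fine-tuned for
multi-domain customer support chat instructions.
"""

# MultiWOZ-style starter prompts shown under the chat box. Each example is a
# one-element list: a single user message that pre-fills the textbox.
examples = [
    ["Hi there! Can you give me some info on Cityroomz?"],
    ["I am looking for a restaurant. I would like something cheap that has Chinese food."],
    ["Please find me a train from Cambridge to Stansted airport."],
    ["I'd like a train from Leicester to Cambridge, please!"],
    ["Hi, I'm traveling to Cambridge soon and am looking forward to seeing some local tourist attractions."],
]

# In the Space these are passed to Gradio, roughly:
#   chat_interface = gr.ChatInterface(fn=generate, stop_btn=None, examples=examples)
```

Swapping the generic Llama-2 prompts for MultiWOZ-domain ones (hotels, restaurants, trains, attractions) makes the starter examples match what the fine-tuned model was actually trained on.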