Update app.py
app.py CHANGED
@@ -15,9 +15,9 @@ def main():
 [report](https://api.wandb.ai/links/dmeltzer/7an677es) for details on our training procedure.\
 \n\nThe two fine-tuned models (BART-Large and FLAN-T5-Base) are hosted on AWS using a combination of AWS Sagemaker, Lambda, and API gateway.\
 GPT-3.5 is called using the OpenAI API and the FLAN-T5-XXL model is hosted by HuggingFace and is called with their Inference API.\
-\n \n **Disclaimer**: When first running this application it may take
+\n \n **Disclaimer**: When first running this application it may take approximately 30 seconds for the first two responses to load because of the cold start problem with AWS Lambda.\
 You may recieve also an error message when calling the FLAN-T5-XXL model since the Inference API takes around 20 seconds to load the model.\
-Both issues should disappear on any subsequent calls to the application
+Both issues should disappear on any subsequent calls to the application.")

 AWS_checkpoints = {}
 AWS_checkpoints['BART-Large']='https://8hlnvys7bh.execute-api.us-east-1.amazonaws.com/beta/'
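For context on how app.py presumably uses these endpoints: the diff stores the API Gateway URL for BART-Large in AWS_checkpoints, and the surrounding text mentions the HuggingFace Inference API for FLAN-T5-XXL. The sketch below shows what such calls could look like, assuming a JSON {"inputs": ...} payload and the hypothetical helper names summarize_with_bart and query_flan_t5_xxl; neither the payload shape nor the helpers are taken from the actual app.

import requests

# Endpoint copied from the diff above; everything else in this sketch is assumed.
AWS_checkpoints = {}
AWS_checkpoints['BART-Large'] = 'https://8hlnvys7bh.execute-api.us-east-1.amazonaws.com/beta/'

def summarize_with_bart(text: str) -> dict:
    # Hypothetical call to the BART-Large model behind API Gateway + Lambda + SageMaker.
    # The first request may take ~30 s while the Lambda/SageMaker endpoint cold-starts,
    # which is the behaviour the disclaimer added in this commit warns about.
    response = requests.post(AWS_checkpoints['BART-Large'], json={'inputs': text}, timeout=60)
    response.raise_for_status()
    return response.json()

def query_flan_t5_xxl(text: str, hf_token: str) -> dict:
    # Hypothetical call to the hosted FLAN-T5-XXL model via the HuggingFace Inference API.
    # While the model is loading (roughly 20 s), the API returns an error response, so a
    # caller may need to wait and retry, matching the second warning in the disclaimer.
    api_url = 'https://api-inference.huggingface.co/models/google/flan-t5-xxl'
    headers = {'Authorization': f'Bearer {hf_token}'}
    response = requests.post(api_url, headers=headers, json={'inputs': text}, timeout=60)
    return response.json()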