{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 10. Building a Demo with Gradio and Hugging Face Spaces\n",
"\n",
"Now that we've built a powerful LLM-based classifier, let's showcase it to the world (or your colleagues) by creating an interactive demo. In this chapter, we'll learn how to:\n",
"\n",
"1. Create a user-friendly web interface using Gradio\n",
"2. Package our demo for deployment\n",
"3. Deploy it on Hugging Face Spaces for free\n",
"4. Use the Hugging Face Inference API for model access"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### What we'll do\n",
"\n",
"We will start from [the functional notebook](ch9-improving-prompts.ipynb) we built in Chapter 9 and add an interactive layer on top of it.\n",
"\n",
"1. **Add Gradio**\n",
"\n",
"Gradio is a Python library for quickly building web interfaces where users can interact with your model. Here we only need to import it; the dependency itself is listed in the requirements file (more on that below).\n",
"\n",
" ```python\n",
" import gradio as gr\n",
" ```\n",
"\n",
"2. **Add an interface function that will call what we already coded**\n",
"\n",
"Here we will define the interface function that connects Gradio to the model we built earlier. This function will take input from the user, process it with the classifier, and return the result.\n",
"\n",
"    ```python\n",
"    import json\n",
"\n",
"    # -- Gradio interface function --\n",
"    def classify_business_names(input_text):\n",
"        # Parse the input text into a list of names, skipping blank lines\n",
"        name_list = [line.strip() for line in input_text.splitlines() if line.strip()]\n",
"\n",
"        if not name_list:\n",
"            return json.dumps({\"error\": \"No business names provided. Please enter at least one business name.\"})\n",
"\n",
"        try:\n",
"            result = classify_payees(name_list)\n",
"            return json.dumps(result, indent=2)\n",
"        except Exception as e:\n",
"            return json.dumps({\"error\": f\"Classification failed: {str(e)}\"})\n",
"    ```\n",
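"\n",
"    The list comprehension above drops blank lines and trims stray whitespace, so users can paste a messy column of names. A quick check of just that parsing step:\n",
"\n",
"    ```python\n",
"    input_text = \"Main Street Cafe\\n\\n  The Grand Hotel  \\n\"\n",
"    # Keep only non-empty lines, stripped of surrounding whitespace\n",
"    name_list = [line.strip() for line in input_text.splitlines() if line.strip()]\n",
"    print(name_list)  # ['Main Street Cafe', 'The Grand Hotel']\n",
"    ```\n",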
"\n",
"3. **Launch the Gradio interface**\n",
"\n",
"    ```python\n",
"    # -- Launch the demo --\n",
"    demo = gr.Interface(\n",
"        fn=classify_business_names,\n",
"        inputs=gr.Textbox(lines=10, placeholder=\"Enter business names, one per line\"),\n",
"        outputs=\"json\",\n",
"        title=\"Business Category Classifier\",\n",
"        description=\"Enter business names and get a classification: Restaurant, Bar, Hotel, or Other.\"\n",
"    )\n",
"\n",
"    demo.launch(share=True)\n",
"    ```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 🌍 Publish your demo to Hugging Face Spaces\n",
"\n",
"To share your Gradio app with the world, you can deploy it to [Hugging Face Spaces](https://huggingface.co/spaces) in just a few steps.\n",
"\n",
"### 1. Prepare your files\n",
"\n",
"Make sure your project has:\n",
"- An `app.py` file containing your Gradio app (e.g. `gr.Interface(...)`)\n",
"- A `sample.csv` file with the examples used for few-shot classification\n",
"- A `requirements.txt` file listing any Python dependencies:\n",
"```\n",
"gradio\n",
"huggingface_hub\n",
"pandas\n",
"scikit-learn\n",
"retry\n",
"rich\n",
"```\n",
"\n",
"> **Example files are ready to use in the [gradio-app](https://huggingface.co/spaces/JournalistsonHF/first-llm-classifier/tree/main/notebooks/gradio-app) folder!**\n",
"\n",
"### 2. Create a new Space\n",
"\n",
"1. Go to [huggingface.co/spaces](https://huggingface.co/spaces)\n",
"2. Click **\"Create new Space\"**\n",
"3. Choose:\n",
" - **SDK**: Gradio\n",
" - **License**: (choose one, e.g. MIT)\n",
" - **Visibility**: Public or Private\n",
"4. Name your Space and click **Create Space**\n",
"\n",
"### 3. Upload your files\n",
"\n",
"You can:\n",
"- Use the web interface to upload `app.py`, `sample.csv` and `requirements.txt`, or\n",
"- Clone the Space repo with Git and push your files:\n",
"```bash\n",
"git lfs install\n",
"git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME\n",
"cd YOUR_SPACE_NAME\n",
"# Add your files here\n",
"git add .\n",
"git commit -m \"Initial commit\"\n",
"git push\n",
"```\n",
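"\n",
"If you prefer to stay in Python, the `huggingface_hub` library (already in `requirements.txt`) can push a folder to your Space. A minimal sketch, assuming the Space already exists and you are logged in via `huggingface-cli login` (the function name `push_to_space` is just for illustration):\n",
"\n",
"```python\n",
"def push_to_space(repo_id, folder_path=\".\"):\n",
"    # Upload the folder contents to an existing Space repo\n",
"    from huggingface_hub import HfApi\n",
"\n",
"    api = HfApi()  # picks up your cached token\n",
"    return api.upload_folder(repo_id=repo_id, folder_path=folder_path, repo_type=\"space\")\n",
"\n",
"# push_to_space(\"YOUR_USERNAME/YOUR_SPACE_NAME\")\n",
"```\n",
"\n",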
"### 4. Add your Hugging Face token to Secrets\n",
"\n",
"For your Gradio app to interact with Hugging Face’s Inference API (or any other Hugging Face service), you need to securely store your Hugging Face token.\n",
"\n",
"1. In your Hugging Face Space:\n",
" - Navigate to the **Settings** of your Space.\n",
" - Go to the **Secrets** tab.\n",
"   - Add a new secret:\n",
"     - **Key**: `HF_TOKEN`\n",
"     - **Value**: your Hugging Face token, which you can create at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens).\n",
"\n",
"Once added, the token will be accessible in your Space, and you can securely reference it in your code with:\n",
"\n",
"```python\n",
"import os\n",
"\n",
"from huggingface_hub import InferenceClient\n",
"\n",
"api_key = os.getenv(\"HF_TOKEN\")\n",
"client = InferenceClient(token=api_key)\n",
"```\n",
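"\n",
"With the client in hand, a request to the Inference API might be assembled like the sketch below. The prompt and the model name are placeholders, not the exact ones from Chapter 9:\n",
"\n",
"```python\n",
"def build_messages(name):\n",
"    # Assemble a one-name chat request for the classifier\n",
"    return [\n",
"        {\"role\": \"system\", \"content\": \"Classify the business as Restaurant, Bar, Hotel, or Other.\"},\n",
"        {\"role\": \"user\", \"content\": name},\n",
"    ]\n",
"\n",
"# reply = client.chat_completion(build_messages(\"The Grand Hotel\"), model=\"meta-llama/Llama-3.1-8B-Instruct\")\n",
"# print(reply.choices[0].message.content)\n",
"```\n",
"\n",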
"\n",
"### 5. Done 🎉\n",
"\n",
"Your app will build and be available at:\n",
"```\n",
"https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME\n",
"```\n",
"\n",
"Need inspiration? Check out [awesome Spaces](https://huggingface.co/spaces?sort=trending)!\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.5"
}
},
"nbformat": 4,
"nbformat_minor": 4
}