---
title: Web Search Mcp
emoji: 🚀
colorFrom: red
colorTo: red
sdk: gradio
sdk_version: 5.33.0
app_file: app.py
pinned: true
license: apache-2.0
tags:
- mcp-server-track
- hackathon
- mcp
- web-search
- search-engines
- gradio
- python
short_description: An MCP app for searching across multiple search engines
---
# Web Search MCP
A unified Gradio-based web application for searching across multiple search engines using the Model Context Protocol (MCP).
## Features
- Tabbed Interface: Search multiple engines from a single UI (Brave, DuckDuckGo, SearxNG, SerpAPI, Serper, Tavily).
- Async & Unified Results: Fast, standardized results in a consistent table format.
- Easy Extension: Add new search engines with minimal code.
- API Key Support: Bring your own API keys, avoiding vendor lock-in.
## Demo Video
Watch our project demonstration and learn how Web Search MCP works:
## About Gradio MCP

Gradio MCP implements the Model Context Protocol (MCP), a standardized protocol that allows you to expose your Gradio app's tools and interfaces so they can be called directly by Large Language Models (LLMs) and agentic AI systems. By launching your app with `mcp_server=True`, it becomes accessible as an MCP server, letting LLMs and compatible clients programmatically use your functions.

- Gradio MCP uses your function's docstrings and type hints to automatically generate tool documentation and parameter descriptions.
- The MCP server runs alongside the regular Gradio UI and exposes a standardized endpoint (e.g., `/gradio_api/mcp/sse`).
- Compatible clients (such as Claude Desktop, Cursor, or Cline) can connect to your MCP endpoint for seamless integration.
- This makes it easy to build advanced AI workflows by connecting your Gradio tools to the broader LLM ecosystem.
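For illustration, a minimal app launched as an MCP server might look like the sketch below. The function name and wiring here are hypothetical placeholders, not code from this project; the point is that the docstring and type hints become the MCP tool schema.

```python
def search_web(query: str, max_results: int = 5) -> str:
    """Search the web for a query.

    Args:
        query: The search terms.
        max_results: Maximum number of results to return.
    """
    # A real tool would call a search engine here; this stub just echoes.
    return f"Top {max_results} results for: {query}"

if __name__ == "__main__":
    import gradio as gr

    # Gradio reads the docstring and type hints above to describe the tool.
    demo = gr.Interface(
        fn=search_web,
        inputs=[gr.Textbox(label="Query"), gr.Number(value=5, precision=0)],
        outputs="text",
    )
    # mcp_server=True serves the MCP endpoint alongside the regular UI.
    demo.launch(mcp_server=True)
```

Once running, the MCP endpoint is available at the `/gradio_api/mcp/sse` path of the app's URL.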
## Project Structure

- `app.py` — Entry point; builds and launches the Gradio app.
- `app_wrapper.py` — Manages the Gradio `TabbedInterface` and interface registration.
- `register_interfaces.py` — Registers all search engine wrappers as Gradio tabs.
- `search_engines/` — Contains wrappers and async search functions for each engine: `brave.py`, `duckduckgo.py`, `searxng.py`, `serpapi.py`, `serper.py`, `tavily.py`, `base.py`.
- `utils/helpers.py` — Utility to map and standardize search results.
- `requirements.txt` — Python dependencies.
- `.env` — Stores API keys and config (not included in the repo).
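As a sketch of the kind of normalization `utils/helpers.py` performs (the function name and key names below are illustrative assumptions, not the project's actual API), each engine's raw results are mapped onto one common schema before display:

```python
from typing import Any

def map_results(raw: list[dict[str, Any]], title_key: str = "title",
                url_key: str = "url", snippet_key: str = "snippet") -> list[dict[str, str]]:
    """Normalize engine-specific result dicts to a common title/url/snippet schema."""
    return [
        {
            "title": str(item.get(title_key, "")),
            "url": str(item.get(url_key, "")),
            "snippet": str(item.get(snippet_key, "")),
        }
        for item in raw
    ]

# Example: an engine that returns "name"/"link"/"desc" keys.
rows = map_results(
    [{"name": "Gradio", "link": "https://gradio.app", "desc": "Build ML demos"}],
    title_key="name", url_key="link", snippet_key="desc",
)
```

Because every engine funnels through one mapping step, the tabbed UI can render all results in the same table format.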
## Installation

1. Clone the repository:

   ```bash
   git clone <repo-url>
   cd web-search-mcp
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Configure the environment:
   - Copy `.env.example` to `.env` (if available) and fill in the required API keys for the engines you wish to use.
## Usage

Run the application:

```bash
python app.py
```

The Gradio UI will launch with a tab for each supported search engine. Enter your query and (where required) your API key to search.
## Adding a New Search Engine

1. Create a new wrapper in `search_engines/`, subclassing `BaseInterfaceWrapper`.
2. Implement the async search function and result mapping.
3. Register your new wrapper in `register_interfaces.py`.
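A new engine's async search function might follow the shape below. Everything here — the function name, the fake response, and the key names — is a hypothetical sketch, not the project's real interface; follow `base.py` and an existing wrapper such as `brave.py` for the actual contract.

```python
import asyncio
from typing import Any

async def search_myengine(query: str, api_key: str) -> list[dict[str, str]]:
    """Query a hypothetical engine and map results to the standard schema."""
    # A real wrapper would make an async HTTP call here, for example:
    # resp = await client.get("https://api.myengine.example/search",
    #                         params={"q": query, "key": api_key})
    # This stub substitutes a canned response for illustration.
    raw: list[dict[str, Any]] = [
        {"name": f"Result for {query}", "link": "https://example.com", "desc": "..."}
    ]
    # Map engine-specific keys onto the unified title/url/snippet format.
    return [{"title": r["name"], "url": r["link"], "snippet": r["desc"]} for r in raw]

# The coroutine can be driven synchronously for a quick check.
results = asyncio.run(search_myengine("gradio mcp", "demo-key"))
```

Keeping the search function async matches the rest of the project and lets several engines be queried without blocking the UI.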
## Contributing
- Follow the structure and code style of existing wrappers.
- Ensure new engines use async functions and standardized result mapping.
- Submit a pull request with a clear description of changes.
## Team
This project was created by Team Expensynth for the Gradio Agents Hackathon Track 1:
This project uses Gradio MCP and LangChain for search engine integration. For more info, see their official docs.