---
title: LLMSearchEngine
emoji: 🏆
colorFrom: gray
colorTo: purple
sdk: docker
app_file: app.py
pinned: false
---

# LLM Search Engine

This is a Flask-based web application that uses a large language model (LLM) to generate search-engine-style results, styled to resemble Google’s classic results page. Instead of querying an external search API, it prompts the LLM to create titles, snippets, and URLs for a given query and presents them in a familiar, paginated interface.
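
As a rough illustration of the approach (not the actual `app.py`), a Flask route can ask the model for structured results and return them. The route path, client library, model name, and JSON response format below are assumptions made for this sketch only:

```python
import json

from flask import Flask, jsonify, request
from openai import OpenAI  # assumed client; this README does not name the model or provider

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "Act as a search engine. For the query '{query}', return a JSON array of "
    "{n} objects, each with 'title', 'snippet', and 'url' fields."
)

@app.route("/search")  # hypothetical endpoint; the real app serves a Google-styled HTML page
def search():
    query = request.args.get("q", "")
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": PROMPT.format(query=query, n=10)}],
    )
    # The model's reply is expected to be a JSON array of generated results.
    results = json.loads(response.choices[0].message.content)
    return jsonify(results)

if __name__ == "__main__":
    app.run()
```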

## Why We Built It

We created this app to explore how LLMs can mimic traditional search engines by generating results directly from their training data. It offers:

- A nostalgic, Google-like pagination design with clickable links.
- A proof-of-concept for LLM-driven search without real-time web access.
- A simple, self-contained alternative for queries within the model’s knowledge base.

## Features

- Google-Styled Interface: Search bar, result list, and pagination styled with Google’s colors and layout.
- Generated Results: Titles, snippets, and URLs are fully produced by the LLM.
- Pagination: Displays 10 results per page, up to 30 total results across 3 pages (see the sketch below).
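
The pagination arithmetic is straightforward; here is a minimal sketch, where the constants and function name are illustrative rather than the app’s actual code:

```python
RESULTS_PER_PAGE = 10
MAX_RESULTS = 30  # 3 pages, matching the limits described above

def paginate(results, page):
    """Return the slice of results shown on the given 1-indexed page."""
    results = results[:MAX_RESULTS]
    start = (page - 1) * RESULTS_PER_PAGE
    return results[start:start + RESULTS_PER_PAGE]

# Example: page 2 of 30 generated results covers items 10-19.
page_two = paginate(list(range(30)), 2)
assert page_two == list(range(10, 20))
```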

## Limitations

- Static Knowledge: Results are limited to the LLM’s training cutoff (e.g., pre-2025).
- Generated Content: URLs and snippets may not correspond to real web pages; treat them as a starting point.
- No Real-Time Data: Best for historical or established topics, not breaking news.

## Using It on Hugging Face Spaces

### Try the Demo

The app is deployed on Hugging Face Spaces; you can test it at https://codelion-llmsearchengine.hf.space:

  1. Open the URL in your browser.
  2. Type a query (e.g., "best Python libraries") in the search bar and press Enter or click "LLM Search".
  3. Browse the paginated results, styled like Google, using "Previous" and "Next" links.
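
If you prefer to poke at the Space from a script rather than a browser, here is a hedged sketch using `requests`; the query-parameter name (`q`) is an assumption about the search form, not something documented in this README:

```python
import requests

SPACE_URL = "https://codelion-llmsearchengine.hf.space"

# Fetch the search page for a query. The "q" parameter is assumed; inspect the
# form on the live page if this returns the plain landing page instead.
resp = requests.get(SPACE_URL, params={"q": "best Python libraries"}, timeout=60)
resp.raise_for_status()
print(f"HTTP {resp.status_code}: received {len(resp.text)} bytes of HTML")
```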