---
language:
- en
library_name: transformers
base_model: meta-llama/Llama-3.2-1B
license: apache-2.0
---

## Model Information

We release the HTML pruner model used in **HtmlRAG: HTML is Better Than Plain Text for Modeling Retrieved Knowledge in RAG Systems**.
Useful links: 📄 [Paper](https://arxiv.org/abs/2411.02959) • 🤗 [Hugging Face](https://huggingface.co/zstanjj/HTML-Pruner-Llama-1B) • 🧩 [Github](https://github.com/plageon/HTMLRAG)
We propose HtmlRAG, which uses HTML instead of plain text as the format of external knowledge in RAG systems. To tackle the long context brought by HTML, we propose **Lossless HTML Cleaning** and **Two-Step Block-Tree-Based HTML Pruning**.

- **Lossless HTML Cleaning**: This cleaning process only removes totally irrelevant contents and compresses redundant structures, retaining all semantic information in the original HTML. The compressed HTML produced by lossless cleaning is suitable for RAG systems that have long-context LLMs and do not want to lose any information before generation.
- **Two-Step Block-Tree-Based HTML Pruning**: The block-tree-based HTML pruning consists of two steps, both conducted on the block tree structure. The first pruning step uses an embedding model to calculate scores for blocks, while the second step uses a path generative model. The first step processes the result of lossless HTML cleaning, while the second step processes the result of the first pruning step.

⭐ If you use this model, please ✨ star our **[GitHub repository](https://github.com/plageon/HTMLRAG)** to support us. Your star means a lot!

## 📦 Installation

Install the package using pip:

```bash
pip install htmlrag
```

Or install the package from source:

```bash
pip install -e .
```

---

## 📖 User Guide

### 🧹 HTML Cleaning

```python
from htmlrag import clean_html

question = "When was the bellagio in las vegas built?"
html = """
<html>
<head>
<title>When was the bellagio in las vegas built?</title>
</head>
<body>
<p class="class0">The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
</body>
<div>
<div>
<p>Some other text</p>
<p>Some other text</p>
</div>
</div>
<p class="class1"></p>
<!-- Some comment -->
<script type="text/javascript">
document.write("Hello World!");
</script>
</html>
"""

# Lossless cleaning: scripts, comments, empty tags, and attributes are removed,
# while all visible text is kept.
simplified_html = clean_html(html)
print(simplified_html)

# <html>
# <title>When was the bellagio in las vegas built?</title>
# <p>The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
# <div>
# <p>Some other text</p>
# <p>Some other text</p>
# </div>
# </html>
```
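To see how much the lossless cleaning actually compresses, you can compare token counts before and after. This is a small sketch of ours, not part of the `htmlrag` API; it just reuses a Hugging Face tokenizer (the same Llama tokenizer this guide loads later) for counting:

```python
from transformers import AutoTokenizer

# Any tokenizer works for a rough count; we reuse the chat model's tokenizer.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-70B-Instruct")

raw_tokens = len(tokenizer.encode(html))
clean_tokens = len(tokenizer.encode(simplified_html))
print(f"raw: {raw_tokens} tokens, cleaned: {clean_tokens} tokens")
# Cleaning is lossless for visible text, so the count drops without losing content.
```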
### 🌲 Build Block Tree

```python
from htmlrag import build_block_tree

block_tree, simplified_html = build_block_tree(simplified_html, max_node_words=10)
for block in block_tree:
    print("Block Content: ", block[0])
    print("Block Path: ", block[1])
    print("Is Leaf: ", block[2])
    print("")

# Block Content:  <title>When was the bellagio in las vegas built?</title>
# Block Path:  ['html', 'title']
# Is Leaf:  True
#
# Block Content:  <div>
# <p>Some other text</p>
# <p>Some other text</p>
# </div>
# Block Path:  ['html', 'div']
# Is Leaf:  True
#
# Block Content:  <p>The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
# Block Path:  ['html', 'p']
# Is Leaf:  True
```
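The `max_node_words` argument controls block granularity: smaller values split the page into more, finer-grained blocks, while larger values merge children into coarser blocks. A minimal sketch of this effect, reusing the calls above (the granularity claim is our reading of the parameter; check the GitHub repository for details):

```python
# Sketch: compare block-tree granularity under two different word budgets.
for budget in (5, 50):
    tree, _ = build_block_tree(simplified_html, max_node_words=budget)
    print(f"max_node_words={budget}: {len(tree)} blocks")

# Smaller word budgets generally produce more, finer-grained leaf blocks.
```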
### ✂️ Prune HTML Blocks with Embedding Model

```python
from htmlrag import EmbedHTMLPruner

embed_html_pruner = EmbedHTMLPruner(embed_model="bm25")
block_rankings = embed_html_pruner.calculate_block_rankings(question, simplified_html, block_tree)
print(block_rankings)

# [0, 2, 1]

from transformers import AutoTokenizer

chat_tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-70B-Instruct")

max_context_window = 60
pruned_html = embed_html_pruner.prune_HTML(simplified_html, block_tree, block_rankings, chat_tokenizer, max_context_window)
print(pruned_html)

# <html>
# <title>When was the bellagio in las vegas built?</title>
# <p>The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
# </html>
```
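`max_context_window` appears to act as a token budget measured with the chat model's tokenizer: blocks are kept in ranking order until the budget is spent. A quick sanity check, reusing the variables above (our sketch, not part of the `htmlrag` API):

```python
# Sketch: the pruned HTML should fit the token budget passed to prune_HTML.
n_tokens = len(chat_tokenizer.encode(pruned_html))
print(n_tokens, "tokens; budget was", max_context_window)
```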
### ✂️ Prune HTML Blocks with Generative Model

```python
from htmlrag import GenHTMLPruner
import torch

ckpt_path = "zstanjj/HTML-Pruner-Llama-1B"
if torch.cuda.is_available():
    device = "cuda"
else:
    device = "cpu"
gen_embed_pruner = GenHTMLPruner(gen_model=ckpt_path, max_node_words=5, device=device)
block_rankings = gen_embed_pruner.calculate_block_rankings(question, pruned_html)
print(block_rankings)

# [1, 0]

block_tree, pruned_html = build_block_tree(pruned_html, max_node_words=10)
for block in block_tree:
    print("Block Content: ", block[0])
    print("Block Path: ", block[1])
    print("Is Leaf: ", block[2])
    print("")

# Block Content:  <title>When was the bellagio in las vegas built?</title>
# Block Path:  ['html', 'title']
# Is Leaf:  True
#
# Block Content:  <p>The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
# Block Path:  ['html', 'p']
# Is Leaf:  True

max_context_window = 32
pruned_html = gen_embed_pruner.prune_HTML(pruned_html, block_tree, block_rankings, chat_tokenizer, max_context_window)
print(pruned_html)

# <p>The Bellagio is a luxury hotel and casino located on the Las Vegas Strip in Paradise, Nevada. It was built in 1998.</p>
```
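The pruned HTML is what finally goes into the chat model's prompt for answer generation. That last step is outside the `htmlrag` package; below is a minimal, hypothetical prompt assembly using the standard `transformers` chat-template API (the prompt wording is illustrative, not the paper's template):

```python
# Sketch: hand the pruned HTML to the chat model as retrieval context.
messages = [
    {
        "role": "user",
        "content": f"Based on the following HTML, answer the question.\n\n"
                   f"{pruned_html}\n\nQuestion: {question}",
    },
]
prompt = chat_tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```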
## Results

- **Results for [HTML-Pruner-Phi-3.8B](https://huggingface.co/zstanjj/HTML-Pruner-Phi-3.8B) and [HTML-Pruner-Llama-1B](https://huggingface.co/zstanjj/HTML-Pruner-Llama-1B) with Llama-3.1-70B-Instruct as chat model**.

| Dataset          | ASQA      | HotpotQA  | NQ        | TriviaQA  | MuSiQue   | ELI5      |
|------------------|-----------|-----------|-----------|-----------|-----------|-----------|
| Metrics          | EM        | EM        | EM        | EM        | EM        | ROUGE-L   |
| BM25             | 49.50     | 38.25     | 47.00     | 88.00     | 9.50      | 16.15     |
| BGE              | 68.00     | 41.75     | 59.50     | 93.00     | 12.50     | 16.20     |
| E5-Mistral       | 63.00     | 36.75     | 59.50     | 90.75     | 11.00     | 16.17     |
| LongLLMLingua    | 62.50     | 45.00     | 56.75     | 92.50     | 10.25     | 15.84     |
| JinaAI Reader    | 55.25     | 34.25     | 48.25     | 90.00     | 9.25      | 16.06     |
| HtmlRAG-Phi-3.8B | **68.50** | **46.25** | 60.50     | **93.50** | **13.25** | **16.33** |
| HtmlRAG-Llama-1B | 66.50     | 45.00     | **60.75** | 93.00     | 10.00     | 16.25     |

---

## 📜 Citation

```bibtex
@misc{tan2024htmlraghtmlbetterplain,
      title={HtmlRAG: HTML is Better Than Plain Text for Modeling Retrieved Knowledge in RAG Systems},
      author={Jiejun Tan and Zhicheng Dou and Wen Wang and Mang Wang and Weipeng Chen and Ji-Rong Wen},
      year={2024},
      eprint={2411.02959},
      archivePrefix={arXiv},
      primaryClass={cs.IR},
      url={https://arxiv.org/abs/2411.02959},
}
```