---
license: apache-2.0
pipeline_tag: feature-extraction
library_name: transformers
---
# Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders
This model is presented in the paper *Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders*. It is a cross-encoder architecture designed for efficient and permutation-invariant listwise passage re-ranking.
Code: https://github.com/webis-de/set-encoder
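To illustrate what permutation invariance means here, the following toy sketch (a conceptual illustration, not the actual Set-Encoder implementation) applies self-attention across a set of per-passage embeddings without any positional information, so shuffling the passages merely permutes the resulting scores.

```python
# Conceptual sketch only: inter-passage attention over per-passage embeddings
# with no positional information, so permuting the passages only permutes the scores.
import torch
import torch.nn as nn

torch.manual_seed(0)

num_passages, dim = 4, 8
passage_embeddings = torch.randn(num_passages, dim)  # e.g. per-passage [CLS] vectors

attention = nn.MultiheadAttention(embed_dim=dim, num_heads=2, batch_first=True)
scorer = nn.Linear(dim, 1)

def scores(x: torch.Tensor) -> torch.Tensor:
    # Self-attention across the passage set; no positional encodings are added.
    attended, _ = attention(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0))
    return scorer(attended).squeeze(-1).squeeze(0)

perm = torch.randperm(num_passages)
original = scores(passage_embeddings)
shuffled = scores(passage_embeddings[perm])

# The permuted input yields the same scores, just in permuted order.
print(torch.allclose(original[perm], shuffled, atol=1e-6))
```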
We provide the following pre-trained models:
| Model Name | TREC DL 19 (BM25) | TREC DL 20 (BM25) | TREC DL 19 (ColBERTv2) | TREC DL 20 (ColBERTv2) |
|---|---|---|---|---|
| `set-encoder-base` | 0.724 | 0.710 | 0.788 | 0.777 |
| `set-encoder-large` | 0.727 | 0.735 | 0.789 | 0.790 |
## Inference
We recommend using the `lightning-ir` CLI to run inference. The following command runs inference with the `set-encoder-base` model on the TREC DL 19 and TREC DL 20 datasets:
```bash
lightning-ir re_rank --config configs/re-rank.yaml --config configs/set-encoder-finetuned.yaml --config configs/trec-dl.yaml
```
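For quick programmatic checks, a scoring call along the following lines should also work. This is a minimal sketch that assumes the `CrossEncoderModule` class and its `score` method from the `lightning-ir` Python API, as well as the model identifier `webis/set-encoder-base`; consult the `lightning-ir` documentation for the exact interface.

```python
# Minimal sketch (assumptions: lightning-ir's CrossEncoderModule API and the
# model id "webis/set-encoder-base"; check the lightning-ir docs for the exact interface).
from lightning_ir import CrossEncoderModule

# Load the Set-Encoder as a cross-encoder re-ranking module.
module = CrossEncoderModule("webis/set-encoder-base")

# Score a query against a small set of candidate passages; higher is more relevant.
scores = module.score(
    "what is permutation-invariant re-ranking",
    [
        "The Set-Encoder applies inter-passage attention without positional information.",
        "Paris is the capital of France.",
    ],
)
print(scores)
```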
## Fine-Tuning
WIP