Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders

This model is presented in the paper "Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders". It is a cross-encoder architecture for efficient, permutation-invariant listwise passage re-ranking: candidate passages exchange information through inter-passage attention, while the resulting scores are independent of the order in which the passages are given.
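The key property is that the ranking does not depend on the input order of the candidate passages. As a toy illustration of the general principle behind this (a NumPy sketch of why attention without positional information is permutation-equivariant, not the authors' implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # single-head self-attention with no positional encodings;
    # each row of X stands in for one passage representation
    scores = X @ X.T / np.sqrt(X.shape[1])
    return softmax(scores, axis=-1) @ X

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))       # 4 passages, 8-dim embeddings
perm = np.array([2, 0, 3, 1])     # an arbitrary reordering

out = self_attention(X)
out_perm = self_attention(X[perm])

# permuting the input passages permutes the outputs identically,
# so each passage's representation (and hence its score) is
# independent of the position at which it was presented
assert np.allclose(out[perm], out_perm)
```

Applied to inter-passage attention over the whole candidate set, this is what makes the re-ranking scores free of position bias.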

Code: https://github.com/webis-de/set-encoder

We provide the following pre-trained models. Scores are reported on TREC DL 19 and TREC DL 20, re-ranking first-stage candidates retrieved with BM25 or ColBERTv2:

| Model Name | TREC DL 19 (BM25) | TREC DL 20 (BM25) | TREC DL 19 (ColBERTv2) | TREC DL 20 (ColBERTv2) |
| --- | --- | --- | --- | --- |
| set-encoder-base | 0.724 | 0.710 | 0.788 | 0.777 |
| set-encoder-large | 0.727 | 0.735 | 0.789 | 0.790 |

Inference

We recommend using the lightning-ir CLI for inference. The following command runs inference with the set-encoder-base model on the TREC DL 19 and TREC DL 20 datasets:

```shell
lightning-ir re_rank --config configs/re-rank.yaml --config configs/set-encoder-finetuned.yaml --config configs/trec-dl.yaml
```

Fine-Tuning

Fine-tuning instructions are a work in progress; see the code repository linked above for updates.

Model size: 334M parameters (F32, Safetensors)
