---
license: apache-2.0
pipeline_tag: text-ranking
library_name: lightning-ir
base_model:
- webis/tite-2-late
tags:
- bi-encoder
---
# TITE: Token-Independent Text Encoder
This model is presented in the paper [TITE: Token-Independent Text Encoder for Information Retrieval](https://dl.acm.org/doi/10.1145/3726302.3730094). It is an efficient bi-encoder model that computes embeddings for queries and documents.
We provide the following pre-trained encoder models:
- [webis/tite-2-late](https://huggingface.co/webis/tite-2-late)
- [webis/tite-2-late-upscale](https://huggingface.co/webis/tite-2-late-upscale)
We provide the following fine-tuned bi-encoder models for text ranking:
| Model | TREC DL 19 | TREC DL 20 | BEIR (geometric mean) |
|-------|------------|------------|-----------------------|
| [`webis/tite-2-late-msmarco`](https://huggingface.co/webis/tite-2-late-msmarco) | 0.69 | 0.71 | 0.40 |
| [`webis/tite-2-late-upscale-msmarco`](https://huggingface.co/webis/tite-2-late-upscale-msmarco) | 0.68 | 0.71 | 0.41 |
## Usage
See the [repository](https://github.com/webis-de/tite) for more information on how to use or reproduce the model.
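
Since the card lists `lightning-ir` as its library, a minimal usage sketch follows. It assumes lightning-ir's `BiEncoderModule` can load this checkpoint by name and that its `score` method returns an output object with a `scores` tensor; consult the repository for the authoritative usage.

```python
from lightning_ir import BiEncoderModule

# Load the fine-tuned bi-encoder by its Hugging Face model name
# (assumption: BiEncoderModule resolves hub checkpoints directly).
module = BiEncoderModule("webis/tite-2-late-msmarco")

# Score a query against candidate documents. As a bi-encoder, the model
# embeds queries and documents independently and compares the embeddings.
output = module.score(
    "What is the capital of France?",
    [
        "Paris is the capital and largest city of France.",
        "The Eiffel Tower is located in Paris.",
    ],
)
print(output.scores)  # assumption: one relevance score per document
```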
## Citation
If you use this code or the models in your research, please cite our paper:
```bibtex
@InProceedings{schlatt:2025,
  author    = {Ferdinand Schlatt and Tim Hagen and Martin Potthast and Matthias Hagen},
  booktitle = {48th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2025)},
  doi       = {10.1145/3726302.3730094},
  month     = jul,
  pages     = {2493--2503},
  publisher = {ACM},
  site      = {Padua, Italy},
  title     = {{TITE: Token-Independent Text Encoder for Information Retrieval}},
  year      = 2025
}
```