How to use webis/colbert with Lightning IR:
```python
# install from https://github.com/webis-de/lightning-ir
from lightning_ir import BiEncoderModule

model = BiEncoderModule("webis/colbert")
model.score("query", ["doc1", "doc2", "doc3"])
```
Lightning IR ColBERT
This is a ColBERT[^1] model fine-tuned using Lightning IR.
See the Lightning IR Model Zoo for a comparison with other models.
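ColBERT scores a query against a passage by late interaction: every query token embedding is compared to every passage token embedding, and the per-query-token maxima are summed (the "MaxSim" operator). The sketch below illustrates that operator with random vectors standing in for real BERT token embeddings; `maxsim` and `normalize` are illustrative names, not Lightning IR API.

```python
import numpy as np

def normalize(x):
    # unit-normalize each token embedding so dot products are cosine similarities
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def maxsim(q_emb, d_emb):
    """ColBERT late interaction: for each query token, take the maximum
    similarity over all document tokens, then sum over query tokens."""
    sim = q_emb @ d_emb.T          # (num_query_tokens, num_doc_tokens)
    return float(sim.max(axis=1).sum())

rng = np.random.default_rng(0)
q = normalize(rng.normal(size=(4, 8)))    # 4 query token embeddings, dim 8
d = normalize(rng.normal(size=(12, 8)))   # 12 passage token embeddings
score = maxsim(q, d)                      # bounded by the 4 query tokens
```

Because each cosine similarity is at most 1, the score is bounded by the number of query tokens, and a passage identical to the query attains that bound.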
Reproduction
To reproduce the model training, install Lightning IR and run the following command using the fine-tune.yaml configuration file:
```
lightning-ir fit --config fine-tune.yaml
```
To index MS MARCO passages, use the following command and the index.yaml configuration file:
```
lightning-ir index --config index.yaml
```
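Lightning IR writes a real token-level index to disk; what such an index conceptually holds can be shown with a toy in-memory version: one matrix of normalized token embeddings per passage, with search ranking passages by their late-interaction score. All names here are hypothetical, and random vectors stand in for encoder outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def maxsim(q_emb, d_emb):
    # late interaction: per-query-token maximum similarity, summed
    return float((q_emb @ d_emb.T).max(axis=1).sum())

# toy "index": one matrix of normalized token embeddings per passage,
# with a variable number of tokens per passage
index = {
    f"doc{i}": normalize(rng.normal(size=(int(rng.integers(5, 20)), 8)))
    for i in range(3)
}

# toy "search": score every indexed passage and rank by descending MaxSim
query = normalize(rng.normal(size=(4, 8)))
ranking = sorted(index, key=lambda doc_id: maxsim(query, index[doc_id]),
                 reverse=True)
```

A production index avoids this exhaustive scan with compression and approximate nearest-neighbor candidate generation, but the scoring it ultimately approximates is the same.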
After indexing, to evaluate the model on TREC Deep Learning 2019 and 2020, use the following command and the search.yaml configuration file:
```
lightning-ir search --config search.yaml
```
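The TREC Deep Learning 2019 and 2020 tracks use graded relevance judgments, and the headline metric is nDCG@10. As a reference for reading the evaluation output, here is a self-contained sketch of that metric; the relevance grades are toy values, not actual run results.

```python
import math

def dcg(rels):
    # discounted cumulative gain: graded relevance discounted by log2 of rank
    return sum(r / math.log2(i + 2) for i, r in enumerate(rels))

def ndcg_at_k(ranked_rels, k=10):
    """nDCG@k: DCG of the ranking divided by the DCG of the ideal ordering."""
    ideal = sorted(ranked_rels, reverse=True)
    denom = dcg(ideal[:k])
    return dcg(ranked_rels[:k]) / denom if denom > 0 else 0.0

# toy graded relevance of the top-5 retrieved passages for one query
score = ndcg_at_k([3, 2, 0, 1, 0])   # strong but imperfect ranking, just below 1.0
```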
[^1]: Khattab and Zaharia, ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT
Model tree for webis/colbert
- Base model: google-bert/bert-base-uncased