nickprock/setfit-banking77

This is a SetFit model that can be used for text classification. The model was trained with an efficient few-shot learning technique that involves two steps (see the sketch after the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
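
These two steps correspond to the two components stored inside a SetFitModel: a Sentence Transformer "body" that produces embeddings, and a classification head fitted on those embeddings. A minimal sketch of the two parts, using the model_body and model_head attributes exposed by the SetFit library (shown here as an illustration, not something stated in this card):

from setfit import SetFitModel

# Load the trained model from the Hub
model = SetFitModel.from_pretrained("nickprock/setfit-banking77")

# Step 1 produced the contrastively fine-tuned Sentence Transformer (the "body")
embeddings = model.model_body.encode(["I can't pay by my credit card"])

# Step 2 produced the classification head fitted on those embeddings
print(model.model_head.predict(embeddings))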

Training Hyperparameters

  • Few-shot regime simulated by sampling 25 examples per class (see the training sketch after this list)
  • Sentence Transformer checkpoint: "sentence-transformers/paraphrase-distilroberta-base-v2"
  • Number of text pairs to generate for contrastive learning: 10
  • Epochs: 1
  • Batch size: 32
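
These settings map onto the SetFit training API. The sketch below is an assumed reproduction recipe rather than the author's exact script: it presumes the banking77 dataset on the Hub and the SetFitTrainer interface from the SetFit version available when this model was published (newer releases expose a Trainer/TrainingArguments API instead).

from datasets import load_dataset
from setfit import SetFitModel, SetFitTrainer, sample_dataset

# Load banking77 and simulate the few-shot regime with 25 examples per class
dataset = load_dataset("banking77")
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=25)
eval_dataset = dataset["test"]

# Start from the Sentence Transformer checkpoint listed above
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-distilroberta-base-v2")

trainer = SetFitTrainer(
    model=model,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    num_iterations=10,  # number of text pairs to generate for contrastive learning
    num_epochs=1,
    batch_size=32,
)
trainer.train()
print(trainer.evaluate())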

Metrics on the evaluation set

  • accuracy score: 0.8529
  • f1 score: 0.8527
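
The card does not state which split or which F1 averaging was used. The sketch below assumes the banking77 test split and macro-averaged F1, so its output may differ slightly from the numbers above:

from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from setfit import SetFitModel

# Load the trained model and the assumed evaluation split
model = SetFitModel.from_pretrained("nickprock/setfit-banking77")
test = load_dataset("banking77", split="test")

preds = model.predict(test["text"])
print("accuracy:", accuracy_score(test["label"], preds))
print("f1:", f1_score(test["label"], preds, average="macro"))  # averaging scheme assumed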

Usage

To use this model for inference, first install the SetFit library:

python -m pip install setfit

You can then run inference as follows:

from setfit import SetFitModel

# Download the model from the Hub
model = SetFitModel.from_pretrained("nickprock/setfit-banking77")
# Run inference
preds = model(["I can't pay by my credit card"])
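
The predictions come back as label ids rather than intent names. One possible way to map them to readable labels, assuming the label order of the banking77 dataset on the Hub matches the one used during training:

from datasets import load_dataset
from setfit import SetFitModel

model = SetFitModel.from_pretrained("nickprock/setfit-banking77")
label_names = load_dataset("banking77", split="train").features["label"].names

preds = model(["I can't pay by my credit card"])
print([label_names[int(p)] for p in preds])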

BibTeX entry and citation info

@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi       = {10.48550/ARXIV.2209.11055},
  url       = {https://arxiv.org/abs/2209.11055},
  author    = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords  = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title     = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year      = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
