
Adapter distilbert-base-uncased_sentiment_sst-2_pfeiffer for distilbert-base-uncased

An adapter for distilbert-base-uncased using the Pfeiffer architecture, trained on the SST-2 sentiment classification dataset for 15 epochs with early stopping and a learning rate of 1e-4.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")
adapter_name = model.load_adapter("AdapterHub/distilbert-base-uncased_sentiment_sst-2_pfeiffer")
model.set_active_adapters(adapter_name)

Architecture & Training

  • Adapter architecture: pfeiffer
  • Prediction head: classification
  • Dataset: SST-2

Author Information

Citation


This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/distilbert-base-uncased_sentiment_sst-2_pfeiffer.yaml.
