Sparse BERT mini model (uncased)

Fine-tuned model pruned to 90% sparsity with a 1x4 block structured pattern. The model is a pruned version of the BERT mini model.
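As a minimal sketch (assuming the checkpoint id shown on this page, `Intel/bert-mini-sst2-distilled-sparse-90-1X4-block`, and the standard `transformers` API), the pruned structure can be inspected by loading the weights and measuring the fraction of zeros in the encoder's weight matrices, which should be roughly in line with the stated sparsity level:

```python
import torch
from transformers import AutoModelForSequenceClassification

# Load the sparse checkpoint; the pruned 1x4 blocks are stored as explicit zeros.
model = AutoModelForSequenceClassification.from_pretrained(
    "Intel/bert-mini-sst2-distilled-sparse-90-1X4-block"
)

zeros, total = 0, 0
for name, param in model.named_parameters():
    # Only count 2-D weight matrices in the encoder (the layers that were pruned).
    if "encoder" in name and name.endswith("weight") and param.dim() == 2:
        zeros += int((param == 0).sum())
        total += param.numel()

print(f"Encoder weight sparsity: {zeros / total:.1%}")
```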

Intended Use

The model can be used for inference with sparsity optimization. Further details on the model and its usage will be available soon.
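Pending those details, here is a minimal sketch of plain inference (assuming the checkpoint id from this page and the standard `transformers` text-classification pipeline; a sparsity-aware runtime would be needed to turn the zeroed weights into an actual speedup):

```python
from transformers import pipeline

# Run the sparse checkpoint as an ordinary SST-2 sentiment classifier.
classifier = pipeline(
    "text-classification",
    model="Intel/bert-mini-sst2-distilled-sparse-90-1X4-block",
)

print(classifier("A charming and often affecting journey."))
# Returns a list of {'label': ..., 'score': ...} dicts; label names depend on the checkpoint config.
```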

Evaluation Results

We get the following results on the SST-2 task's development set:

| Task | SST-2 (Acc) |
|------|-------------|
| Sparse BERT mini (this model) | 87.2 |

This is better than the dense BERT mini baseline, which reaches 84.74%.
