Model Overview

This is a Tensor Train (TT) compressed version of a pretrained BART-base model fine-tuned on XSUM. The weight matrices were factorized with the TT-matrix (TTM) decomposition and the model was additionally fine-tuned, compressing it to 62% of the original size (64 mln params).
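
As a rough illustration of where the savings come from: in a TT-matrix decomposition, each large weight matrix is replaced by a chain of small 4-dimensional cores whose combined parameter count is far below the dense count. The sketch below uses hypothetical mode splits and TT ranks chosen only for the arithmetic; they are not the actual settings of this checkpoint.

  import numpy as np

  # Dense BART-base FFN projection: 768 x 3072 weights.
  in_factors, out_factors = (4, 12, 16), (8, 16, 24)   # hypothetical mode splits (product = 768 and 3072)
  ranks = (1, 32, 32, 1)                               # hypothetical TT ranks

  dense_params = np.prod(in_factors) * np.prod(out_factors)
  ttm_params = sum(
      ranks[k] * in_factors[k] * out_factors[k] * ranks[k + 1]
      for k in range(len(in_factors))
  )
  print(dense_params, ttm_params, ttm_params / dense_params)  # ~2.36M dense vs ~0.21M TTM cores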

  • eval_loss = 2.5863
  • eval_rouge1 = 34.5695
  • eval_rouge2 = 12.6411
  • eval_rougeL = 27.3062
  • eval_rougeLsum = 27.2974

How to use

  from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

  tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
  model = AutoModelForSeq2SeqLM.from_pretrained("s-nlp/bart-base-xsum-ttd", trust_remote_code=True)
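
A minimal generation sketch on top of the loaded model, assuming the custom repo exposes the standard Hugging Face seq2seq interface (i.e. generate works as for vanilla BART); the decoding parameters below are illustrative.

  # Summarize a document with beam search (illustrative settings).
  article = "The full text of the news article to summarize goes here."
  inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
  summary_ids = model.generate(**inputs, num_beams=4, max_length=64)
  print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))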