---
datasets:
  - abertsch/booksum-fullbooks
pipeline_tag: text2text-generation
---

This model is from the preprint [Unlimiformer: Long-Range Transformers with Unlimited Length Input](https://arxiv.org/abs/2305.01625).

The model was fine-tuned from BART-base using the alternating-training strategy described in Section 3.2 of the paper, on the BookSum dataset in the full-book setting.
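
As a minimal usage sketch, the checkpoint can be loaded like any BART seq2seq model with Hugging Face Transformers. The repo id below is a hypothetical placeholder for this card's actual model id; note that plain Transformers loading gives only BART's standard 1024-token context, while unlimited-length input requires wrapping the model with the Unlimiformer code from the paper's repository.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id -- substitute the actual model id for this card.
model_id = "abertsch/unlimiformer-bart-booksum-alternating"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

book_text = "First chapter of a long book ..."  # placeholder input

# Without the Unlimiformer wrapper, inputs are capped at BART's
# 1024-token context, so long books must be truncated here.
inputs = tokenizer(book_text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```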