Update README.md
README.md
CHANGED
@@ -429,7 +429,7 @@ model-index:

    <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
  </a>

The Longformer Encoder-Decoder (LED) for Narrative-Esque Long Text Summarization is a model I fine-tuned, designed to condense extensive technical, academic, and narrative content in a fairly generalizable manner.

## Key Features and Use Cases
@@ -438,7 +438,7 @@ The Longformer Encoder-Decoder (LED) for Narrative-Esque Long Text Summarization

- High capacity: handles up to 16,384 tokens per batch.
- Demos: try it out in the notebook linked above or in the [demo on Spaces](https://huggingface.co/spaces/pszemraj/summarize-long-text).

> **Note:** The API widget has a max length of ~96 tokens due to inference timeout constraints; see the local-inference sketch below for full-length runs.
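For inputs and outputs longer than the widget allows, the model can be run locally. Below is a minimal sketch using the `transformers` summarization pipeline with the checkpoint listed under Training Details; the generation settings are illustrative assumptions, not the model's tuned defaults.

```python
from transformers import pipeline

# Minimal sketch: run the fine-tuned LED checkpoint locally instead of the capped widget.
# Generation settings below are illustrative assumptions, not tuned defaults.
summarizer = pipeline(
    "summarization",
    model="pszemraj/led-base-16384-finetuned-booksum",
)

long_document = "..."  # your text, up to ~16,384 input tokens
result = summarizer(
    long_document,
    max_length=256,         # output budget, well above the ~96-token widget cap
    min_length=8,
    no_repeat_ngram_size=3,
    truncation=True,        # guard against inputs beyond the 16,384-token limit
)
print(result[0]["summary_text"])
```

Long inputs are slow on CPU; a GPU runtime such as the Colab notebook linked above is the more practical route.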
## Training Details
@@ -448,7 +448,7 @@ Model checkpoint: [`pszemraj/led-base-16384-finetuned-booksum`](https://huggingface.co/pszemraj/led-base-16384-finetuned-booksum)

## Other Related Checkpoints

This model is the smallest/fastest booksum-tuned model I have worked on. If you're looking for higher-quality summaries, check out:

- [Long-T5-tglobal-base](https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary)
- [BigBird-Pegasus-Large-K](https://huggingface.co/pszemraj/bigbird-pegasus-large-K-booksum)
@@ -524,6 +524,6 @@ print(f"summary: {out_str}")

Currently implemented interfaces include a Python API, a command-line interface (CLI), and a shareable demo/web UI.
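As a rough sketch of the Python-API side (the `Summarizer` class, `summarize_string` method, and `model_name_or_path` argument are recalled from the textsum project and should be checked against the docs linked below):

```python
# Sketch of the textsum Python API from memory; confirm class/method names
# against the project README/wiki linked below before relying on them.
from textsum.summarize import Summarizer

summarizer = Summarizer(
    model_name_or_path="pszemraj/led-base-16384-finetuned-booksum",  # assumed argument name
)

long_text = "..."  # document to condense
out_str = summarizer.summarize_string(long_text)
print(f"summary: {out_str}")
```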
For detailed explanations and documentation, check the [README](https://github.com/pszemraj/textsum) or the [wiki](https://github.com/pszemraj/textsum/wiki).

---