---
license: apache-2.0
metrics:
- bertscore
- rouge
- bleu
pipeline_tag: text-generation
language:
- en
library_name: transformers
tags:
- Story-Generation
- State-Space
- text-generation-inference
- story-writing
---

This model is State-Space/Mamba-370M fine-tuned on the ROC Stories dataset to generate coherent endings to short stories.

Evaluation metrics on the ROC Stories dataset for story-ending generation:

| Metric | Score |
| --- | --- |
| BERTScore (F1) | 0.878 |
| METEOR | 0.1 |
| BLEU | 0.0125 |
| ROUGE-1 | 0.18 |
| Perplexity | 207 |

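For reference, below is a minimal sketch of how such scores could be recomputed with the Hugging Face `evaluate` library; the exact evaluation script, data split, and prompt format behind the numbers above are not specified in this card, so the snippet is illustrative only.

```python
# Illustrative only: score a list of generated endings against reference endings.
import evaluate

predictions = ["He bought the bike and rode it home."]          # model-generated endings
references = ["Tom happily bought the bike he had saved for."]  # gold ROC Stories endings

bertscore = evaluate.load("bertscore")
rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")
meteor = evaluate.load("meteor")

print(bertscore.compute(predictions=predictions, references=references, lang="en")["f1"])
print(rouge.compute(predictions=predictions, references=references)["rouge1"])
print(bleu.compute(predictions=predictions, references=[[r] for r in references])["bleu"])
print(meteor.compute(predictions=predictions, references=references)["meteor"])
```
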
### How to use the model

```python
from transformers import AutoTokenizer, MambaForCausalLM

model_name = "DdIiVvYyAaMm/mamba-370m-story-generation"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MambaForCausalLM.from_pretrained(model_name)

# The rest of the generation code is standard, as with any transformers causal LM.
```
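
As an example, here is a minimal generation sketch; the story text, prompt format, and decoding parameters below are assumptions for illustration, not prescribed by the model.

```python
# Illustrative: feed a four-sentence story context and sample a short ending.
story = (
    "Tom had been saving for a new bike all summer. "
    "He mowed lawns and walked dogs every weekend. "
    "Finally he counted his money and rode to the store. "
    "The bike he wanted was still in the window."
)

input_ids = tokenizer(story, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=30, do_sample=True, top_p=0.9)

# Keep only the newly generated tokens, i.e. the ending.
ending = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(ending)
```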