## Usage:
```python
from transformers import AutoTokenizer, pipeline

# Load the tokenizer and a text-generation pipeline for the model.
tokenizer = AutoTokenizer.from_pretrained("SummerSigh/Pythia410m-Instruct-SFT")
generator = pipeline(
    "text-generation",
    model="SummerSigh/Pythia410m-Instruct-SFT",
    tokenizer=tokenizer,
)

user_input = input("Text here: ")

# Wrap the input in the model's prompt tokens and sample a completion.
output = generator(
    "<user>" + user_input + "<user><kinrel>",
    max_length=200,
    do_sample=True,
    top_p=0.7,
    temperature=0.5,
    repetition_penalty=1.2,
    pad_token_id=tokenizer.eos_token_id,
)

# Keep only the prompt and the first generated segment, trimming
# everything after the second "<kinrel>" marker.
generated_text = output[0]["generated_text"]
parts = generated_text.split("<kinrel>")
cropped_text = "<kinrel>".join(parts[:2]) + "<kinrel>"
print(cropped_text)
```