---
license: other
license_name: open-aleph-license
license_link: LICENSE
library_name: transformers
pipeline_tag: text-generation
---

This is the safetensors conversion of Pharia-1-LLM-7B-control. We provide a joint model card for Pharia-1-LLM-7B-control and Pharia-1-LLM-7B-control-aligned; find this model card here.

## Usage

```python
from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast

INPUT = "Hello, how are you"
MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-safetensors"

# trust_remote_code is required because the architecture is defined in the
# repository's custom modeling code.
tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)

# The model does not use token type IDs, so they are dropped from the encoding.
inputs = tokenizer(INPUT, return_token_type_ids=False, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)

generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
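The snippet above uses greedy decoding. As a minimal sketch, sampling can be enabled by passing extra keyword arguments to `generate()`; the parameter values below are illustrative assumptions, not recommendations from this model card:

```python
# Illustrative sampling settings (assumed values, not tuned for this model).
GEN_KWARGS = dict(
    max_new_tokens=50,
    do_sample=True,   # sample instead of greedy decoding
    temperature=0.7,  # soften the next-token distribution
    top_p=0.9,        # nucleus sampling: keep the smallest token set with mass >= 0.9
)

MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-safetensors"

def generate(prompt: str) -> str:
    """Load the model lazily and sample a continuation of `prompt`."""
    # Imports kept inside the function so the module is importable
    # without downloading the model weights up front.
    from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast

    tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_token_type_ids=False, return_tensors="pt")
    outputs = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because sampling is stochastic, repeated calls with the same prompt will generally produce different continuations.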