---
license: other
license_name: open-aleph-license
license_link: LICENSE
library_name: transformers
pipeline_tag: text-generation
---
This is the safetensors conversion of `Pharia-1-LLM-7B-control`.
We provide a joint model card for `Pharia-1-LLM-7B-control` and `Pharia-1-LLM-7B-control-aligned`; you can find it [here](https://huggingface.co/Aleph-Alpha/Pharia-1-LLM-7B-control).
# Usage
```python
from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast

INPUT = "Hello, how are you"
MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-safetensors"

# Load the tokenizer and the model (the custom modeling code requires trust_remote_code)
tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)

# Tokenize the prompt and generate a completion
inputs = tokenizer(INPUT, return_token_type_ids=False, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)

# Decode the generated tokens back into text
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
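
If a GPU is available, the model can optionally be loaded in half precision to reduce memory usage. The following is a minimal sketch using standard `transformers` arguments; the dtype and device choices are assumptions and not part of the original card.

```python
import torch
from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast

MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-safetensors"

tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)

# Assumption: bfloat16 weights on a single CUDA device; adjust to your hardware.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
).to("cuda")

inputs = tokenizer("Hello, how are you", return_token_type_ids=False, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```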