---
license: apache-2.0
language:
- ko
pipeline_tag: text-generation
tags:
- SOLAR
- SOLAR-10.7B
---

### BaseModel

- [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0)

### Model Generation

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("AIdenU/SOLAR-10.7b-ko-Y24_v1.0", device_map="auto", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("AIdenU/SOLAR-10.7b-ko-Y24_v1.0", use_fast=True)

prompt = [
    # "You are an AI assistant that follows instructions very well."
    {'role': 'system', 'content': '당신은 지시를 매우 잘 따르는 인공지능 비서입니다.'},
    # "Does even an earthworm wriggle if you step on it?"
    {'role': 'user', 'content': '지렁이도 밟으면 꿈틀하나요?'}
]

outputs = model.generate(
    **tokenizer(
        tokenizer.apply_chat_template(prompt, tokenize=False, add_generation_prompt=True),
        return_tensors='pt'
    ).to('cuda'),
    max_new_tokens=256,
    temperature=0.2,
    top_p=1,
    do_sample=True
)
print(tokenizer.decode(outputs[0]))
```
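
As written, the final `decode` call prints the prompt together with the model's answer. A minimal variant that keeps only the newly generated tokens, reusing the `model`, `tokenizer`, and `prompt` objects from the snippet above (the slicing idiom is a common Transformers pattern, not something this card prescribes):

```python
# Tokenize the chat-formatted prompt separately so its token length is known.
inputs = tokenizer(
    tokenizer.apply_chat_template(prompt, tokenize=False, add_generation_prompt=True),
    return_tensors='pt'
).to('cuda')

outputs = model.generate(**inputs, max_new_tokens=256, temperature=0.2, top_p=1, do_sample=True)

# Slice off the prompt tokens and drop special tokens, leaving only the reply.
print(tokenizer.decode(outputs[0][inputs['input_ids'].shape[-1]:], skip_special_tokens=True))
```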