---
license: apache-2.0
language:
- ko
pipeline_tag: text-generation
tags:
- SOLAR
- SOLAR-10.7B
---

### BaseModel
- [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0)

### Model Generation
```
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "AIdenU/SOLAR-10.7b-ko-Y24_v0.1",
    device_map="auto",
    torch_dtype=torch.float16
)
tokenizer = AutoTokenizer.from_pretrained("AIdenU/SOLAR-10.7b-ko-Y24_v0.1", use_fast=True)

# Korean prompt — system: "You are an AI assistant that follows instructions very well."
# user: "Does even a worm wriggle if you step on it?"
prompt = [
    {'role': 'system', 'content': '당신은 지시를 매우 잘 따르는 인공지능 비서입니다.'},
    {'role': 'user', 'content': '지렁이도 밟으면 꿈틀하나요?'}
]

outputs = model.generate(
    **tokenizer(
        tokenizer.apply_chat_template(prompt, tokenize=False, add_generation_prompt=True),
        return_tensors='pt'
    ).to('cuda'),
    max_new_tokens=256,
    temperature=0.2,
    top_p=1,
    do_sample=True
)
print(tokenizer.decode(outputs[0]))
```
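Note that `tokenizer.decode(outputs[0])` in the snippet above decodes the prompt tokens together with the model's answer, since `generate` returns the full sequence. A minimal sketch of slicing off the prompt before decoding, using hypothetical token ids in place of the real tensors:

```python
# Hypothetical token ids standing in for the real tensors:
# generate() returns the prompt tokens followed by the new tokens.
prompt_ids = [101, 2023, 2003]             # tokenized prompt (3 tokens)
output_ids = [101, 2023, 2003, 7592, 999]  # full generate() output

# Keep only the newly generated tokens before decoding.
new_ids = output_ids[len(prompt_ids):]
print(new_ids)  # → [7592, 999]
```

With real tensors the same slice is `outputs[0][inputs['input_ids'].shape[-1]:]`, where `inputs` is the dict returned by the tokenizer call.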