---
library_name: transformers
license: mit
language:
- ja
- en
---

# stockmark/stockmark-100b-instruct-v0.1

Stockmark-100b-instruct-v0.1 is an instruction-tuned version of [stockmark-100b](https://huggingface.co/stockmark/stockmark-100b), a 100-billion-parameter LLM developed by [Stockmark Inc.](https://stockmark.co.jp/).

## How to use

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Prompt template expected by the model:
# "### 指示:" = Instruction, "### 応答:" = Response
prompt_template = """### 指示:
{instruction}

### 応答:
"""

tokenizer = AutoTokenizer.from_pretrained("stockmark/stockmark-100b-instruct-v0.1")
model = AutoModelForCausalLM.from_pretrained("stockmark/stockmark-100b-instruct-v0.1", device_map="auto", torch_dtype=torch.bfloat16)

instruction = "生成AIとは?"  # "What is generative AI?"
prompt = prompt_template.format(instruction=instruction)
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
with torch.inference_mode():
    tokens = model.generate(
        input_ids,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
        repetition_penalty=1.08,
    )

output = tokenizer.decode(tokens[0], skip_special_tokens=True)
print(output)
```
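Note that the decoded output above contains the prompt as well as the generated answer. A minimal helper to keep only the response portion, assuming the `### 応答:` marker from the prompt template (the helper name and example text are illustrative, not part of the model's API):

```python
# Split the decoded text on the response marker from the prompt template
# and return only the part generated by the model.
RESPONSE_MARKER = "### 応答:\n"

def extract_response(decoded: str) -> str:
    # str.partition returns (before, marker, after); "after" is the answer.
    _, _, response = decoded.partition(RESPONSE_MARKER)
    return response.strip()

# Illustrative decoded output (not an actual model generation):
decoded = "### 指示:\n生成AIとは?\n\n### 応答:\n生成AIとは、新しいコンテンツを生成するAIです。"
print(extract_response(decoded))
```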

## Dataset (fine-tuning)
- Ichikara instruction [[Web Page](https://liat-aip.sakura.ne.jp/wp/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF%E4%BD%9C%E6%88%90/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF-%E5%85%AC%E9%96%8B/)], [[Paper](https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/A6-3.pdf)]

## License
[MIT](https://opensource.org/licenses/MIT)

## Developed by
[Stockmark Inc.](https://stockmark.co.jp/)