# Train

## Environment

```bash
cd scripts
python -m venv venv
source venv/bin/activate
pip install -U -r requirements.in
```

## Tokenizer

```bash
python -B train_tokenizer.py
```
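Optionally, sanity-check the trained tokenizer. The sketch below assumes `train_tokenizer.py` writes a Hugging Face `tokenizer.json`; the `../tokenizer` path is a placeholder, adjust it to wherever the script saves its output.

```python
# Round-trip a sample string through the freshly trained tokenizer.
# NOTE: the ../tokenizer/tokenizer.json path is an assumption, not fixed by this repo.
from tokenizers import Tokenizer

tok = Tokenizer.from_file('../tokenizer/tokenizer.json')
encoding = tok.encode('Hello, world!')
print(encoding.ids)
print(tok.decode(encoding.ids))
```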

## Dataset

```bash
python -B prepare_pretrain_dataset.py
python -B prepare_contrain_dataset.py
```
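As a rough illustration (not this repo's actual format), a prepare step of this kind typically tokenizes raw text and writes the token ids to disk for the pretraining run; the real scripts may instead emit litdata streaming chunks or whatever layout `litgpt pretrain` expects from the config.

```python
# Generic sketch only: tokenize a toy corpus and dump the ids to a flat binary file.
# Paths and output format are placeholders, not what prepare_pretrain_dataset.py necessarily does.
import numpy as np
from tokenizers import Tokenizer

tok = Tokenizer.from_file('../tokenizer/tokenizer.json')
texts = ['first example document', 'second example document']
ids = [tid for text in texts for tid in tok.encode(text).ids]
np.array(ids, dtype=np.uint16).tofile('pretrain_tokens.bin')
```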

## Model

### Pretraining

```bash
litgpt pretrain --config ./pretrain-model.yaml
litgpt convert_from_litgpt out/pretrain/final/ out/converted_pretrain
cp config.json out/pretrain/final/
cp config.json out/converted_pretrain/
```

```python
import torch
from safetensors.torch import save_file

# Re-save the converted PyTorch state dict in safetensors format.
state_dict = torch.load('out/converted_pretrain/model.pth', map_location='cpu')
save_file(state_dict, 'out/converted_pretrain/model.safetensors')
```
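
As an optional check, the converted directory (with the copied `config.json` and the new `model.safetensors`) should be loadable with `transformers`, assuming `config.json` describes an architecture that `transformers` supports:

```python
# Load the converted checkpoint to verify the export.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('out/converted_pretrain')
print(f'{sum(p.numel() for p in model.parameters()):,} parameters')
```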

### Continued Pretraining

```bash
litgpt convert_pretrained_checkpoint out/pretrain/final/ out/pretrain_checkpoint/final/
cp config.json out/pretrain_checkpoint/final/

litgpt pretrain --config ./contrain-model.yaml
litgpt convert_from_litgpt out/contrain/final/ out/converted_contrain
cp config.json out/converted_contrain/
```

```python
import torch
from safetensors.torch import save_file

# Same conversion for the continued-pretraining checkpoint.
state_dict = torch.load('out/converted_contrain/model.pth', map_location='cpu')
save_file(state_dict, 'out/converted_contrain/model.safetensors')
```

```bash
cp out/converted_contrain/model.pth ./
cp out/converted_contrain/model.safetensors ./
```
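
Finally, a quick smoke test of the copied checkpoint. This assumes the repository root also contains a `config.json` and tokenizer files compatible with `transformers`; adjust the paths if the layout differs.

```python
# Generate a few tokens from the final checkpoint in the repo root.
# Assumes config.json and tokenizer files sit next to model.safetensors.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained('.')
model = AutoModelForCausalLM.from_pretrained('.')
inputs = tok('Hello', return_tensors='pt')
output = model.generate(**inputs, max_new_tokens=20)
print(tok.decode(output[0], skip_special_tokens=True))
```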