---
license: mit
pipeline_tag: text-generation
library_name: transformers
language:
- en
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- eo
- es
- et
- eu
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gn
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lg
- li
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- 'no'
- ns
- om
- or
- pa
- pl
- ps
- pt
- qu
- rm
- ro
- ru
- sa
- si
- sc
- sd
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- te
- th
- tl
- tn
- tr
- ug
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zu
datasets:
- ontocord/fineweb-permissive-multilingual-2m
- distily/c4_multilingual_1M
- data-silence/sumnews
- xu-song/cc100-samples
- badrex/llm-emoji-dataset
- fblgit/simple-math
- Gusarich/math-expressions-1m
- neuralwork/arxiver
- christopher/rosetta-code
- nampdn-ai/tiny-codes
- JeanKaddour/minipile
- NousResearch/hermes-function-calling-v1
- simplescaling/s1K-1.1
- mlabonne/open-perfectblend
- allenai/tulu-3-sft-mixture
- rombodawg/Everything_Instruct_Multilingual
- open-r1/OpenR1-Math-220k
- open-thoughts/OpenThoughts-114k
- cognitivecomputations/dolphin-r1
tags:
- chat
- core
- base
- instruct
- reason
---

# tangled-alpha-0.2-core
```bash
time python -B prepare_core_datasets.py
```

```
Progress: 100%|████████| 220/220 [23:15<00:00, 6.34s/it]
Workers are finished.
Finished data processing!
i=0, block_size=8192, chunk_size=16384000, len(dataset)=893355, len(dataset) * block_size=7318364160
Total number of tokens in the optimized dataset '../core-data-0-8192-2000' is 7318364160
```
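The reported total is simply the number of packed blocks times the block size; a quick sanity check with the values from the log above:

```python
# Values taken from the dataset log above.
block_size = 8192      # tokens per packed block
num_blocks = 893_355   # len(dataset)

total_tokens = num_blocks * block_size
assert total_tokens == 7_318_364_160  # matches the reported total
print(f"{total_tokens:,} tokens (~{total_tokens / 1e9:.2f}B)")
```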
```bash
CUDA_VISIBLE_DEVICES=0 CUDA_LAUNCH_BLOCKING=0 PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True litgpt pretrain --config pretrain-core-model.yaml
```

```
Seed set to 23
Time to instantiate model: 0.23 seconds.
Total parameters: 226,165,248
Verifying settings ...
Measured TFLOPs: 7111.07
Epoch 1 | iter 256 step 1 | loss train: 10.531, val: n/a | iter time: 3552.77 ms (step) remaining time: 4 days, 2:53:44
Epoch 1 | iter 512 step 2 | loss train: 10.517, val: n/a | iter time: 759.61 ms (step) remaining time: 3 days, 21:53:33
Epoch 1 | iter 768 step 3 | loss train: 10.478, val: n/a | iter time: 758.59 ms (step) remaining time: 3 days, 20:06:10
Epoch 1 | iter 1024 step 4 | loss train: 10.432, val: n/a | iter time: 758.46 ms (step) remaining time: 3 days, 19:11:21
Epoch 1 | iter 1280 step 5 | loss train: 10.317, val: n/a | iter time: 757.80 ms (step) remaining time: 3 days, 18:37:07
Epoch 1 | iter 1536 step 6 | loss train: 10.203, val: n/a | iter time: 757.94 ms (step) remaining time: 3 days, 18:13:14
Epoch 1 | iter 1792 step 7 | loss train: 10.092, val: n/a | iter time: 758.36 ms (step) remaining time: 3 days, 17:55:18
Epoch 1 | iter 2048 step 8 | loss train: 9.999, val: n/a | iter time: 758.86 ms (step) remaining time: 3 days, 17:41:21
Epoch 1 | iter 2304 step 9 | loss train: 9.811, val: n/a | iter time: 756.62 ms (step) remaining time: 3 days, 17:29:46
Epoch 1 | iter 2560 step 10 | loss train: 9.700, val: n/a | iter time: 756.86 ms (step) remaining time: 3 days, 17:18:59
Epoch 1 | iter 2816 step 11 | loss train: 9.546, val: n/a | iter time: 757.33 ms (step) remaining time: 3 days, 17:09:34
Epoch 1 | iter 3072 step 12 | loss train: 9.437, val: n/a | iter time: 756.18 ms (step) remaining time: 3 days, 17:01:19
Epoch 1 | iter 3328 step 13 | loss train: 9.336, val: n/a | iter time: 759.60 ms (step) remaining time: 3 days, 16:53:49
Epoch 1 | iter 3584 step 14 | loss train: 9.240, val: n/a | iter time: 758.52 ms (step) remaining time: 3 days, 16:46:55
Epoch 1 | iter 3840 step 15 | loss train: 9.120, val: n/a | iter time: 754.31 ms (step) remaining time: 3 days, 16:40:23
Epoch 1 | iter 4096 step 16 | loss train: 9.016, val: n/a | iter time: 757.21 ms (step) remaining time: 3 days, 16:34:19
Epoch 1 | iter 4352 step 17 | loss train: 8.913, val: n/a | iter time: 754.89 ms (step) remaining time: 3 days, 16:28:34
Epoch 1 | iter 4608 step 18 | loss train: 8.854, val: n/a | iter time: 756.99 ms (step) remaining time: 3 days, 16:23:07
Epoch 1 | iter 4864 step 19 | loss train: 8.798, val: n/a | iter time: 756.30 ms (step) remaining time: 3 days, 16:17:59
Epoch 1 | iter 5120 step 20 | loss train: 8.726, val: n/a | iter time: 756.11 ms (step) remaining time: 3 days, 16:13:04
# ...
```
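The `pretrain-core-model.yaml` config itself is not reproduced in this card. As a rough orientation, the sketch below shows the shape a litgpt pretrain config of this kind typically takes; every value is an illustrative assumption except where a comment ties it to the log above:

```yaml
# Hypothetical sketch of pretrain-core-model.yaml; not the actual config.
out_dir: ../out/pretrain-core   # matches the chat/evaluate paths below
seed: 23                        # "Seed set to 23" in the log
logger_name: wandb              # assumption; a wandb/ directory is backed up below

data:
  class_path: litgpt.data.LitData           # assumption
  init_args:
    data_path: ../core-data-0-8192-2000     # the optimized dataset above

train:
  max_seq_length: 8192          # matches the block_size of the packed dataset
  micro_batch_size: 1           # assumption
  global_batch_size: 256        # consistent with one optimizer step per 256 iters
  log_interval: 256             # assumption, matching the logging cadence
```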
Back up wandb:

```bash
mv wandb wandb-pretrain-core
```
Chat with the model:

```bash
CUDA_VISIBLE_DEVICES=0 CUDA_LAUNCH_BLOCKING=0 PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True litgpt chat ../out/pretrain-core/final
```
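The same checkpoint can also be queried from Python through litgpt's `LLM` API; a minimal sketch (the prompt and sampling settings are illustrative, and as a base model it will continue text rather than follow instructions):

```python
from litgpt import LLM

# Load the final pretraining checkpoint produced above.
llm = LLM.load("../out/pretrain-core/final")

# Base (core) model: expect raw continuation, not instruction following.
print(llm.generate("The capital of France is", max_new_tokens=32))
```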
Evaluate the model:

```bash
CUDA_VISIBLE_DEVICES=0 CUDA_LAUNCH_BLOCKING=0 PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True time litgpt evaluate --tasks 'leaderboard' --out_dir '../evaluate/pretrain-core/leaderboard/' --batch_size 1 --dtype 'bfloat16' '../out/pretrain-core/final'
# ...
```
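`litgpt evaluate` wraps EleutherAI's lm-evaluation-harness (the `leaderboard` task group comes from there). A small sketch for inspecting the saved scores afterwards; the `results.json` filename is an assumption based on the harness's usual output layout:

```python
import json
from pathlib import Path

# out_dir from the evaluate command above; the filename is an assumption
# based on lm-evaluation-harness conventions.
results_path = Path("../evaluate/pretrain-core/leaderboard/results.json")

results = json.loads(results_path.read_text())
for task, metrics in results.get("results", {}).items():
    print(task, metrics)
```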