---
license: other
license_name: fair-ai-public-license-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
datasets:
- pls2000/aiart_channel_nai3_geachu
base_model:
- OnomaAIResearch/Illustrious-xl-early-release-v0
tags:
- lora
---

# LoRA Training (`arcain_2411.safetensors`)
LoRA trained on Illustrious-XL v0.1, but it can also be applied to other ILXL-based models such as NoobAI-XL.

- Tool: kohya-ss/sd-scripts
- GPUs: 4× RTX 3060
- Dataset: pls2000/aiart_channel_nai3_geachu, plus additional Blue Archive data collected up to 2024-11-14
- Time taken: 50.5 hours (wall time)
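
For inference, a minimal usage sketch with diffusers is shown below; the base-model path, LoRA location, and prompt are placeholders, not part of this repo.

```python
# Minimal usage sketch: file paths, weight name, and prompt are assumptions.
import torch
from diffusers import StableDiffusionXLPipeline

# Any Illustrious-XL-based checkpoint should work here (e.g. NoobAI-XL).
pipe = StableDiffusionXLPipeline.from_single_file(
    "Illustrious-XL-v0.1.safetensors",
    torch_dtype=torch.float16,
).to("cuda")

# Load the trained LoRA on top of the base model.
pipe.load_lora_weights("results/lora", weight_name="arcain_2411.safetensors")

image = pipe(
    "1girl, solo, looking at viewer",  # placeholder prompt
    num_inference_steps=28,
    guidance_scale=6.0,
).images[0]
image.save("sample.png")
```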


#### lora_arcain.sh
```bash
NCCL_P2P_DISABLE=1 NCCL_IB_DISABLE=1 accelerate launch --num_cpu_threads_per_process 4 sdxl_train_network.py \
        --network_train_unet_only \
        --network_module="networks.lora" --network_dim 128 --network_alpha 128 \
        --pretrained_model_name_or_path="/ai/data/sd/models/Stable-diffusion/SDXL/Illustrious-XL-v0.1.safetensors" \
        --dataset_config="arcain.lora.toml" \
        --output_dir="results/lora" --output_name="arcain-`date +%y%m`" \
        --save_model_as="safetensors" \
        --train_batch_size 2 --gradient_accumulation_steps 64 \
        --learning_rate=1e-5 --optimizer_type="Lion8bit" \
        --lr_scheduler="constant_with_warmup" --lr_warmup_steps 100 --optimizer_args "weight_decay=0.01" "betas=0.9,0.95" --min_snr_gamma 5 \
        --sdpa \
        --no_half_vae \
        --cache_latents --cache_latents_to_disk \
        --gradient_checkpointing \
        --full_bf16 --mixed_precision="bf16" --save_precision="fp16" \
        --ddp_timeout=10000000 \
        --max_train_epochs 8 --save_every_n_epochs 1 \
        --log_with wandb --log_tracker_name kohya-ss --wandb_run_name "arcain_`date +%y%m%d-%H%M`" --logging_dir wandb
```
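
With `--train_batch_size 2`, `--gradient_accumulation_steps 64`, and 4 GPUs, each optimizer step sees an effective batch of 2 × 64 × 4 = 512 images.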

#### arcain.lora.toml
```toml
[general]
shuffle_caption = true
caption_tag_dropout_rate = 0.2
keep_tokens_separator = "|||"
caption_extension = ".txt"

[[datasets]]
  enable_bucket = true
  min_bucket_reso = 512
  max_bucket_reso = 4096
  resolution = 1024
  [[datasets.subsets]]
  image_dir = "/mnt/wd8tb/train/to_train"
  num_repeats = 1
```
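
With `keep_tokens_separator = "|||"`, the tags before the separator in each caption file stay fixed, while the tags after it are shuffled (`shuffle_caption`) and individually dropped with probability 0.2 (`caption_tag_dropout_rate`). A hypothetical caption `.txt` could look like this (tag names are illustrative only):

```
arcain, 1girl ||| blue hair, smile, school uniform, outdoors
```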