---
license: openrail
---

Results from an experiment using Condition Embedding Perturbation (CEP): https://arxiv.org/pdf/2405.20494

The code change is very simple: during the training loop, add noise to the conditioning embeddings between the text_encoder forward pass and the U-Net prediction, based on this code:

    import math
    import torch

    # scale unit Gaussian noise by gamma / sqrt(embedding_dim) and add it to the text embeddings
    perturbation_deviation = embedding_perturbation / math.sqrt(encoder_hidden_states.shape[2])
    perturbation_delta = torch.randn_like(encoder_hidden_states) * perturbation_deviation
    encoder_hidden_states = encoder_hidden_states + perturbation_delta

"embedding_perturbation" is the tunable gamma from the paper.

Models are SD1.5 fine-tuned for 30 epochs (U-Net only) with AdamW8bit, a constant learning rate of 2e-6, and batch size 12. The EveryDream2Trainer config JSON is included.


    #!/bin/bash
    # python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 0.0" --embedding_perturbation 0.0
    python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 1.0" --embedding_perturbation 1.0
    python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 1.7" --embedding_perturbation 1.7
    python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 3.5" --embedding_perturbation 3.5

![Validation loss curves](val_loss_graphs.jpg)