Hhhhhao97 committed on
Commit 28a595f · verified · 1 Parent(s): 91f1176

ldm_cc3m_random_noise_30
README.md ADDED
@@ -0,0 +1,101 @@
+ ---
+ tags:
+ - DiffusionNoise
+ license: "apache-2.0"
+ datasets:
+ - CC3M
+ metrics:
+ - FID
+ - IS
+ base_model: "DiffusionNoise/ldm_cc3m_random_noise_30"
+ ---
+
+ # ldm_cc3m_random_noise_30 Model Card
+
+ This repository contains diffusion models (DMs) developed for exploring the impact of slight corruption in pre-training data on generative performance.
+ By introducing controlled condition corruption, we observed significant improvements in the quality, diversity, and fidelity of generated outputs across various DM architectures.
+ This **ldm_cc3m_random_noise_30** is trained on CC3M.
+
+ The models and resources provided here aim to advance research in robust generative modeling and inspire new approaches to data-centric AI development.
+ Explore the models at https://huggingface.co/DiffusionNoise.
+
+ **Note**: Since the models are trained on CC3M, which is a relatively small dataset, they might be incapable of following complex prompts.
+
+ ## Usage
+
+ Currently, we only support a custom diffusers version.
+ You need to install diffusers from here: https://github.com/Hhhhhhao/diffusers.
+
+ Install
+ ```
+ git clone https://github.com/Hhhhhhao/diffusers.git
+ cd diffusers
+ pip install -e ./
+
+ or
+
+ pip install git+https://github.com/Hhhhhhao/diffusers.git
+ ```
+
+ Use with (Custom) Diffusers
+ ```
+ import torch
+
+ from diffusers.pipelines.latent_diffusion.pipeline_latent_diffusion import LDMTextToImagePipeline
+ from diffusers import DPMSolverMultistepScheduler
+
+ # load model
+ pipeline = LDMTextToImagePipeline.from_pretrained('DiffusionNoise/ldm_cc3m_random_noise_30')
+ # use DPM scheduler for faster inference
+ pipeline.scheduler = DPMSolverMultistepScheduler.from_config(pipeline.scheduler.config)
+ # use float16
+ pipeline = pipeline.to('cuda')
+ pipeline = pipeline.to(torch.float16)
+
+ # inference
+ prompt = 'A photograph of a modern kitchen with appliances'
+ batch_size = 1
+ num_inference_steps = 25
+ guidance_scale = 2.5
+ images = pipeline([prompt] * batch_size, num_inference_steps=num_inference_steps, guidance_scale=guidance_scale).images
+
+ images[0].save("img.png")
+ ```
+
+ More comprehensive usage scripts can be found here: https://github.com/Hhhhhhao/DiffusionNoise
+
+
+ ### Direct Use
+
+ The model is intended for research purposes only. Possible research areas and tasks include:
+ * Safe deployment of models which have the potential to generate harmful content.
+ * Probing and understanding the limitations and biases of generative models.
+ * Generation of artworks and use in design and other artistic processes.
+ * Applications in educational or creative tools.
+ * Research on generative models.
+
+ Excluded uses are described below.
+
+ ### Misuse, Malicious Use, and Out-of-Scope Use
+
+ The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
+
+ The model was not trained to produce factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for its abilities.
+
+ Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
+ * Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
+ * Intentionally promoting or propagating discriminatory content or harmful stereotypes.
+ * Impersonating individuals without their consent.
+ * Sexual content without consent of the people who might see it.
+ * Mis- and disinformation.
+ * Representations of egregious violence and gore.
+ * Sharing of copyrighted or licensed material in violation of its terms of use.
+ * Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
+
+ ## Citation
+ ```
+ @inproceedings{chen2024slight,
+ title={Slight Corruption in Pre-training Data Makes Better Diffusion Models},
+ author={Chen, Hao and Han, Yujin and Misra, Diganta and Li, Xiang and Hu, Kai and Zou, Difan and Sugiyama, Masashi and Wang, Jindong and Raj, Bhiksha},
+ booktitle={Neural Information Processing Systems (NeurIPS)},
+ year={2024}
+ }
+ ```
+
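The `guidance_scale` used in the usage snippet above controls classifier-free guidance. As a rough sketch (not the pipeline's internal code), at each denoising step the noise prediction fed to the scheduler extrapolates from the unconditional prediction toward the text-conditioned one:

```python
def apply_cfg(noise_uncond, noise_cond, guidance_scale):
    """Classifier-free guidance: move from the unconditional noise
    prediction toward (and, for scale > 1, past) the conditioned one."""
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)

# guidance_scale = 1.0 recovers the conditional prediction exactly
assert apply_cfg(0.0, 1.0, 1.0) == 1.0
# guidance_scale = 2.5, as in the usage snippet, extrapolates past it
assert apply_cfg(0.0, 1.0, 2.5) == 2.5
```

Higher values follow the prompt more strongly at the cost of diversity; 2.5 is a mild setting, consistent with the note that CC3M-trained models handle only simple prompts.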
bert/config.json ADDED
@@ -0,0 +1,26 @@
+ {
+ "_name_or_path": "bert-base-uncased",
+ "architectures": [
+ "BertModel"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "classifier_dropout": null,
+ "gradient_checkpointing": false,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 768,
+ "initializer_range": 0.02,
+ "intermediate_size": 3072,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 12,
+ "pad_token_id": 0,
+ "position_embedding_type": "absolute",
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.1",
+ "type_vocab_size": 2,
+ "use_cache": true,
+ "vocab_size": 30522
+ }
bert/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5a3d803e47c9e598e14a230a4149dde79f48db19f96f9bbbe105333da944837a
+ size 437951328
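As a quick sanity check, the BERT-base hyperparameters in `bert/config.json` predict a parameter count whose float32 footprint matches the ~438 MB `model.safetensors` file above. This is a back-of-the-envelope sketch assuming the standard `BertModel` layout (embeddings + 12 encoder layers + pooler), not an official count:

```python
# Values taken from bert/config.json above
V, P, T = 30522, 512, 2   # vocab_size, max_position_embeddings, type_vocab_size
H, I, L = 768, 3072, 12   # hidden_size, intermediate_size, num_hidden_layers

embeddings = (V + P + T) * H + 2 * H             # token/position/type tables + LayerNorm
attention  = 4 * (H * H + H) + 2 * H             # Q, K, V, output projections + LayerNorm
ffn        = (H * I + I) + (I * H + H) + 2 * H   # up/down projections + LayerNorm
pooler     = H * H + H

total = embeddings + L * (attention + ffn) + pooler
print(total)      # 109482240 parameters
print(total * 4)  # 437928960 float32 bytes; the safetensors file adds a small header
```

The predicted 437,928,960 bytes differs from the 437,951,328-byte file only by the safetensors header, so the checkpoint is a plain float32 BERT-base text encoder.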
model_index.json ADDED
@@ -0,0 +1,24 @@
+ {
+ "_class_name": "LDMTextToImagePipeline",
+ "_diffusers_version": "0.27.0.dev0",
+ "bert": [
+ "transformers",
+ "BertModel"
+ ],
+ "scheduler": [
+ "diffusers",
+ "DDIMScheduler"
+ ],
+ "tokenizer": [
+ "transformers",
+ "BertTokenizerFast"
+ ],
+ "unet": [
+ "diffusers",
+ "UNet2DConditionModel"
+ ],
+ "vqvae": [
+ "diffusers",
+ "VQModel"
+ ]
+ }
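The `model_index.json` above is how diffusers knows which class to instantiate for each pipeline component: every non-underscore key names a subfolder of the repo and maps it to a `(library, class)` pair. A minimal sketch of reading it (the parsing here is illustrative, not diffusers' internal loader):

```python
import json

# The same content as model_index.json above
model_index = json.loads("""
{
  "_class_name": "LDMTextToImagePipeline",
  "_diffusers_version": "0.27.0.dev0",
  "bert": ["transformers", "BertModel"],
  "scheduler": ["diffusers", "DDIMScheduler"],
  "tokenizer": ["transformers", "BertTokenizerFast"],
  "unet": ["diffusers", "UNet2DConditionModel"],
  "vqvae": ["diffusers", "VQModel"]
}
""")

# Keys starting with "_" are pipeline metadata; the rest map a component
# name (which is also its subfolder) to the library and class used to load it.
components = {k: v for k, v in model_index.items() if not k.startswith("_")}
for name, (library, cls) in components.items():
    print(f"{name}: load {library}.{cls} from ./{name}/")
```

This is why the repo contains `bert/`, `scheduler/`, and the other subfolders: `from_pretrained` walks this index and loads each piece from its matching directory.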
scheduler/scheduler_config.json ADDED
@@ -0,0 +1,1020 @@
+ {
+ "_class_name": "DDIMScheduler",
+ "_diffusers_version": "0.27.0.dev0",
+ "beta_end": 0.012,
+ "beta_schedule": "linear",
+ "beta_start": 0.00085,
+ "clip_sample": false,
+ "clip_sample_range": 1.0,
+ "dynamic_thresholding_ratio": 0.995,
+ "num_train_timesteps": 1000,
+ "prediction_type": "epsilon",
+ "rescale_betas_zero_snr": false,
+ "sample_max_value": 1.0,
+ "set_alpha_to_one": false,
+ "steps_offset": 1,
+ "thresholding": false,
+ "timestep_spacing": "leading",
+ "trained_betas": [
19
+ 0.0008500000112690032,
20
+ 0.0008546986500732601,
21
+ 0.0008594102691859007,
22
+ 0.0008641348103992641,
23
+ 0.0008688723319210112,
24
+ 0.0008736227755434811,
25
+ 0.0008783861994743347,
26
+ 0.0008831625455059111,
27
+ 0.0008879518718458712,
28
+ 0.0008927541202865541,
29
+ 0.0008975693490356207,
30
+ 0.0009023974998854101,
31
+ 0.0009072386310435832,
32
+ 0.000912092684302479,
33
+ 0.0009169597178697586,
34
+ 0.000921839673537761,
35
+ 0.000926732609514147,
36
+ 0.0009316384675912559,
37
+ 0.0009365573059767485,
38
+ 0.0009414890664629638,
39
+ 0.0009464338072575629,
40
+ 0.0009513914701528847,
41
+ 0.0009563620551489294,
42
+ 0.0009613456786610186,
43
+ 0.0009663421660661697,
44
+ 0.0009713516337797046,
45
+ 0.0009763740818016231,
46
+ 0.0009814094519242644,
47
+ 0.0009864578023552895,
48
+ 0.0009915190748870373,
49
+ 0.0009965932695195079,
50
+ 0.0010016805026680231,
51
+ 0.0010067806579172611,
52
+ 0.001011893735267222,
53
+ 0.0010170197347179055,
54
+ 0.0010221587726846337,
55
+ 0.001027310616336763,
56
+ 0.0010324756149202585,
57
+ 0.001037653419189155,
58
+ 0.0010428441455587745,
59
+ 0.0010480479104444385,
60
+ 0.0010532645974308252,
61
+ 0.0010584942065179348,
62
+ 0.001063736854121089,
63
+ 0.001068992423824966,
64
+ 0.0010742609156295657,
65
+ 0.0010795423295348883,
66
+ 0.0010848367819562554,
67
+ 0.0010901440400630236,
68
+ 0.0010954643366858363,
69
+ 0.0011007976718246937,
70
+ 0.001106143812648952,
71
+ 0.001111502991989255,
72
+ 0.0011168750934302807,
73
+ 0.0011222601169720292,
74
+ 0.0011276581790298223,
75
+ 0.0011330691631883383,
76
+ 0.001138493069447577,
77
+ 0.0011439298978075385,
78
+ 0.0011493796482682228,
79
+ 0.0011548424372449517,
80
+ 0.0011603181483224034,
81
+ 0.001165806781500578,
82
+ 0.001171308453194797,
83
+ 0.0011768229305744171,
84
+ 0.0011823504464700818,
85
+ 0.0011878910008817911,
86
+ 0.0011934443609789014,
87
+ 0.0011990107595920563,
88
+ 0.001204590080305934,
89
+ 0.0012101823231205344,
90
+ 0.0012157876044511795,
91
+ 0.0012214056914672256,
92
+ 0.0012270368169993162,
93
+ 0.0012326808646321297,
94
+ 0.0012383379507809877,
95
+ 0.0012440079590305686,
96
+ 0.0012496908893808722,
97
+ 0.0012553867418318987,
98
+ 0.001261095516383648,
99
+ 0.0012668173294514418,
100
+ 0.0012725520646199584,
101
+ 0.0012782997218891978,
102
+ 0.00128406030125916,
103
+ 0.0012898339191451669,
104
+ 0.0012956204591318965,
105
+ 0.001301419921219349,
106
+ 0.001307232421822846,
107
+ 0.001313057728111744,
108
+ 0.0013188960729166865,
109
+ 0.001324747339822352,
110
+ 0.001330611645244062,
111
+ 0.0013364888727664948,
112
+ 0.0013423789059743285,
113
+ 0.0013482820941135287,
114
+ 0.00135419808793813,
115
+ 0.0013601271202787757,
116
+ 0.0013660690747201443,
117
+ 0.0013720239512622356,
118
+ 0.0013779917499050498,
119
+ 0.0013839725870639086,
120
+ 0.0013899663463234901,
121
+ 0.0013959730276837945,
122
+ 0.0014019926311448216,
123
+ 0.0014080252731218934,
124
+ 0.001414070837199688,
125
+ 0.0014201293233782053,
126
+ 0.0014262007316574454,
127
+ 0.0014322851784527302,
128
+ 0.0014383825473487377,
129
+ 0.001444492838345468,
130
+ 0.0014506160514429212,
131
+ 0.001456752303056419,
132
+ 0.0014629014767706394,
133
+ 0.0014690635725855827,
134
+ 0.0014752385905012488,
135
+ 0.0014814266469329596,
136
+ 0.001487627625465393,
137
+ 0.0014938415260985494,
138
+ 0.0015000683488324285,
139
+ 0.0015063082100823522,
140
+ 0.0015125608770176768,
141
+ 0.001518826698884368,
142
+ 0.00152510532643646,
143
+ 0.0015313969925045967,
144
+ 0.0015377014642581344,
145
+ 0.0015440189745277166,
146
+ 0.0015503495233133435,
147
+ 0.0015566928777843714,
148
+ 0.0015630492707714438,
149
+ 0.001569418585859239,
150
+ 0.001575800939463079,
151
+ 0.0015821960987523198,
152
+ 0.0015886042965576053,
153
+ 0.0015950254164636135,
154
+ 0.0016014594584703445,
155
+ 0.0016079065389931202,
156
+ 0.0016143665416166186,
157
+ 0.0016208394663408399,
158
+ 0.0016273253131657839,
159
+ 0.0016338241985067725,
160
+ 0.0016403358895331621,
161
+ 0.0016468606190755963,
162
+ 0.0016533983871340752,
163
+ 0.001659948960877955,
164
+ 0.0016665125731378794,
165
+ 0.0016730891074985266,
166
+ 0.0016796785639598966,
167
+ 0.0016862810589373112,
168
+ 0.0016928964760154486,
169
+ 0.0016995248151943088,
170
+ 0.0017061660764738917,
171
+ 0.0017128202598541975,
172
+ 0.0017194874817505479,
173
+ 0.001726167625747621,
174
+ 0.001732860691845417,
175
+ 0.0017395667964592576,
176
+ 0.001746285823173821,
177
+ 0.0017530177719891071,
178
+ 0.001759762642905116,
179
+ 0.0017665204359218478,
180
+ 0.0017732912674546242,
181
+ 0.0017800750210881233,
182
+ 0.0017868716968223453,
183
+ 0.0017936814110726118,
184
+ 0.0018005040474236012,
185
+ 0.0018073396058753133,
186
+ 0.0018141880864277482,
187
+ 0.001821049489080906,
188
+ 0.0018279239302501082,
189
+ 0.0018348112935200334,
190
+ 0.0018417115788906813,
191
+ 0.0018486249027773738,
192
+ 0.0018555510323494673,
193
+ 0.0018624902004376054,
194
+ 0.001869442407041788,
195
+ 0.0018764074193313718,
196
+ 0.001883385470137,
197
+ 0.0018903764430433512,
198
+ 0.001897380338050425,
199
+ 0.0019043971551582217,
200
+ 0.001911427010782063,
201
+ 0.001918469788506627,
202
+ 0.001925525488331914,
203
+ 0.0019325942266732454,
204
+ 0.0019396757706999779,
205
+ 0.001946770353242755,
206
+ 0.0019538779743015766,
207
+ 0.0019609984010457993,
208
+ 0.0019681318663060665,
209
+ 0.0019752781372517347,
210
+ 0.0019824374467134476,
211
+ 0.001989609794691205,
212
+ 0.0019967949483543634,
213
+ 0.0020039931405335665,
214
+ 0.002011204371228814,
215
+ 0.0020184284076094627,
216
+ 0.002025665482506156,
217
+ 0.00203291536308825,
218
+ 0.002040178282186389,
219
+ 0.0020474542398005724,
220
+ 0.002054743003100157,
221
+ 0.002062044804915786,
222
+ 0.0020693596452474594,
223
+ 0.002076687291264534,
224
+ 0.002084027975797653,
225
+ 0.0020913814660161734,
226
+ 0.002098747994750738,
227
+ 0.0021061275620013475,
228
+ 0.002113519934937358,
229
+ 0.002120925346389413,
230
+ 0.0021283437963575125,
231
+ 0.002135775052011013,
232
+ 0.002143219346180558,
233
+ 0.0021506764460355043,
234
+ 0.002158146584406495,
235
+ 0.0021656297612935305,
236
+ 0.002173125743865967,
237
+ 0.0021806347649544477,
238
+ 0.0021881568245589733,
239
+ 0.0021956916898489,
240
+ 0.002203239593654871,
241
+ 0.002210800303146243,
242
+ 0.00221837405115366,
243
+ 0.002225960837677121,
244
+ 0.0022335604298859835,
245
+ 0.0022411730606108904,
246
+ 0.002248798729851842,
247
+ 0.0022564372047781944,
248
+ 0.0022640887182205915,
249
+ 0.0022717530373483896,
250
+ 0.0022794303949922323,
251
+ 0.0022871207911521196,
252
+ 0.002294823992997408,
253
+ 0.002302540233358741,
254
+ 0.0023102692794054747,
255
+ 0.002318011596798897,
256
+ 0.0023257664870470762,
257
+ 0.002333534648641944,
258
+ 0.0023413156159222126,
259
+ 0.0023491093888878822,
260
+ 0.00235691643320024,
261
+ 0.002364736283197999,
262
+ 0.002372568938881159,
263
+ 0.0023804146330803633,
264
+ 0.0023882733657956123,
265
+ 0.0023961449041962624,
266
+ 0.002404029481112957,
267
+ 0.0024119270965456963,
268
+ 0.0024198375176638365,
269
+ 0.0024277609772980213,
270
+ 0.0024356974754482508,
271
+ 0.002443646779283881,
272
+ 0.0024516091216355562,
273
+ 0.0024595842696726322,
274
+ 0.002467572456225753,
275
+ 0.002475573681294918,
276
+ 0.0024835877120494843,
277
+ 0.002491614781320095,
278
+ 0.002499654656276107,
279
+ 0.0025077075697481632,
280
+ 0.0025157735217362642,
281
+ 0.002523852279409766,
282
+ 0.0025319440755993128,
283
+ 0.002540048910304904,
284
+ 0.002548166550695896,
285
+ 0.002556297229602933,
286
+ 0.0025644409470260143,
287
+ 0.0025725974701344967,
288
+ 0.0025807670317590237,
289
+ 0.0025889493990689516,
290
+ 0.002597144804894924,
291
+ 0.0026053530164062977,
292
+ 0.0026135744992643595,
293
+ 0.0026218087878078222,
294
+ 0.002630055882036686,
295
+ 0.0026383160147815943,
296
+ 0.0026465891860425472,
297
+ 0.002654875162988901,
298
+ 0.0026631741784512997,
299
+ 0.002671486232429743,
300
+ 0.002679811092093587,
301
+ 0.0026881489902734756,
302
+ 0.0026964996941387653,
303
+ 0.0027048634365200996,
304
+ 0.0027132402174174786,
305
+ 0.0027216298040002584,
306
+ 0.002730032429099083,
307
+ 0.002738448092713952,
308
+ 0.002746876562014222,
309
+ 0.002755318069830537,
310
+ 0.002763772616162896,
311
+ 0.0027722399681806564,
312
+ 0.0027807201258838177,
313
+ 0.002789213554933667,
314
+ 0.0027977197896689177,
315
+ 0.002806238830089569,
316
+ 0.002814770909026265,
317
+ 0.002823316026479006,
318
+ 0.002831874182447791,
319
+ 0.0028404451441019773,
320
+ 0.0028490289114415646,
321
+ 0.00285762595012784,
322
+ 0.0028662357944995165,
323
+ 0.002874858444556594,
324
+ 0.002883494133129716,
325
+ 0.0028921428602188826,
326
+ 0.002900804625824094,
327
+ 0.002909479197114706,
328
+ 0.0029181665740907192,
329
+ 0.0029268672224134207,
330
+ 0.002935580676421523,
331
+ 0.0029443069361150265,
332
+ 0.0029530462343245745,
333
+ 0.002961798571050167,
334
+ 0.0029705639462918043,
335
+ 0.0029793421272188425,
336
+ 0.0029881331138312817,
337
+ 0.002996937371790409,
338
+ 0.0030057544354349375,
339
+ 0.003014584304764867,
340
+ 0.003023427212610841,
341
+ 0.0030322831589728594,
342
+ 0.0030411521438509226,
343
+ 0.0030500339344143867,
344
+ 0.003058928530663252,
345
+ 0.0030678363982588053,
346
+ 0.003076756838709116,
347
+ 0.003085690550506115,
348
+ 0.003094637067988515,
349
+ 0.0031035966239869595,
350
+ 0.003112568985670805,
351
+ 0.003121554385870695,
352
+ 0.00313055282458663,
353
+ 0.0031395640689879656,
354
+ 0.003148588351905346,
355
+ 0.003157625673338771,
356
+ 0.0031666758004575968,
357
+ 0.0031757389660924673,
358
+ 0.003184814937412739,
359
+ 0.003193903947249055,
360
+ 0.0032030059956014156,
361
+ 0.0032121208496391773,
362
+ 0.0032212487421929836,
363
+ 0.003230389440432191,
364
+ 0.0032395434100180864,
365
+ 0.0032487099524587393,
366
+ 0.0032578897662460804,
367
+ 0.0032670823857188225,
368
+ 0.0032762878108769655,
369
+ 0.003285506507381797,
370
+ 0.003294738009572029,
371
+ 0.0033039823174476624,
372
+ 0.00331323966383934,
373
+ 0.0033225100487470627,
374
+ 0.003331793239340186,
375
+ 0.003341089468449354,
376
+ 0.003350398736074567,
377
+ 0.0033597208093851805,
378
+ 0.0033690559212118387,
379
+ 0.0033784040715545416,
380
+ 0.0033877650275826454,
381
+ 0.00339713878929615,
382
+ 0.0034065258223563433,
383
+ 0.0034159256611019373,
384
+ 0.0034253383055329323,
385
+ 0.0034347642213106155,
386
+ 0.003444202709943056,
387
+ 0.003453654469922185,
388
+ 0.0034631190355867147,
389
+ 0.003472596639767289,
390
+ 0.0034820870496332645,
391
+ 0.0034915904980152845,
392
+ 0.003501106984913349,
393
+ 0.0035106362774968147,
394
+ 0.003520178608596325,
395
+ 0.003529733745381236,
396
+ 0.003539301920682192,
397
+ 0.0035488831344991922,
398
+ 0.0035584773868322372,
399
+ 0.0035680842120200396,
400
+ 0.00357770430855453,
401
+ 0.0035873372107744217,
402
+ 0.003596983151510358,
403
+ 0.0036066421307623386,
404
+ 0.0036163139156997204,
405
+ 0.003625998506322503,
406
+ 0.003635696368291974,
407
+ 0.003645407035946846,
408
+ 0.003655130509287119,
409
+ 0.00366486725397408,
410
+ 0.0036746165715157986,
411
+ 0.0036843791604042053,
412
+ 0.003694154554978013,
413
+ 0.0037039429880678654,
414
+ 0.0037137442268431187,
415
+ 0.0037235585041344166,
416
+ 0.0037333855871111155,
417
+ 0.0037432259414345026,
418
+ 0.003753078868612647,
419
+ 0.0037629450671374798,
420
+ 0.0037728240713477135,
421
+ 0.0037827161140739918,
422
+ 0.003792620962485671,
423
+ 0.003802538849413395,
424
+ 0.0038124697748571634,
425
+ 0.003822413505986333,
426
+ 0.003832370275631547,
427
+ 0.003842339850962162,
428
+ 0.0038523224648088217,
429
+ 0.003862318117171526,
430
+ 0.003872326575219631,
431
+ 0.003882348071783781,
432
+ 0.003892382374033332,
433
+ 0.003902429947629571,
434
+ 0.003912490326911211,
435
+ 0.003922563511878252,
436
+ 0.003932649735361338,
437
+ 0.003942748997360468,
438
+ 0.0039528608322143555,
439
+ 0.003962986171245575,
440
+ 0.003973124083131552,
441
+ 0.003983275033533573,
442
+ 0.003993439022451639,
443
+ 0.00400361604988575,
444
+ 0.004013805650174618,
445
+ 0.004024008754640818,
446
+ 0.004034224431961775,
447
+ 0.004044453147798777,
448
+ 0.004054694902151823,
449
+ 0.004064949229359627,
450
+ 0.004075217060744762,
451
+ 0.004085497464984655,
452
+ 0.004095790907740593,
453
+ 0.004106097389012575,
454
+ 0.004116416443139315,
455
+ 0.004126749001443386,
456
+ 0.004137094132602215,
457
+ 0.004147452302277088,
458
+ 0.004157823510468006,
459
+ 0.004168207757174969,
460
+ 0.004178604576736689,
461
+ 0.004189014434814453,
462
+ 0.0041994377970695496,
463
+ 0.004209873732179403,
464
+ 0.004220322240144014,
465
+ 0.004230784252285957,
466
+ 0.004241258837282658,
467
+ 0.0042517464607954025,
468
+ 0.004262247122824192,
469
+ 0.004272760823369026,
470
+ 0.004283287562429905,
471
+ 0.004293826874345541,
472
+ 0.004304379690438509,
473
+ 0.004314945079386234,
474
+ 0.004325523506850004,
475
+ 0.004336114507168531,
476
+ 0.0043467190116643906,
477
+ 0.004357336089015007,
478
+ 0.004367966204881668,
479
+ 0.004378609359264374,
480
+ 0.004389265552163124,
481
+ 0.004399934317916632,
482
+ 0.004410616587847471,
483
+ 0.004421311430633068,
484
+ 0.0044320193119347095,
485
+ 0.004442740231752396,
486
+ 0.004453473724424839,
487
+ 0.004464220721274614,
488
+ 0.004474980290979147,
489
+ 0.004485752899199724,
490
+ 0.004496538545936346,
491
+ 0.0045073372311890125,
492
+ 0.004518148489296436,
493
+ 0.004528972785919905,
494
+ 0.004539810586720705,
495
+ 0.004550660494714975,
496
+ 0.004561523906886578,
497
+ 0.0045724003575742245,
498
+ 0.004583289381116629,
499
+ 0.0045941914431750774,
500
+ 0.004605106543749571,
501
+ 0.004616034682840109,
502
+ 0.0046269758604466915,
503
+ 0.0046379296109080315,
504
+ 0.004648896399885416,
505
+ 0.004659876227378845,
506
+ 0.004670869093388319,
507
+ 0.004681874997913837,
508
+ 0.004692893475294113,
509
+ 0.004703925456851721,
510
+ 0.004714970011264086,
511
+ 0.004726027604192495,
512
+ 0.004737097769975662,
513
+ 0.004748181439936161,
514
+ 0.004759277682751417,
515
+ 0.004770387429744005,
516
+ 0.004781509283930063,
517
+ 0.004792644642293453,
518
+ 0.004803793039172888,
519
+ 0.00481495400890708,
520
+ 0.0048261284828186035,
521
+ 0.004837315529584885,
522
+ 0.004848515149205923,
523
+ 0.0048597282730042934,
524
+ 0.004870954435318708,
525
+ 0.004882193170487881,
526
+ 0.004893444944173098,
527
+ 0.004904709756374359,
528
+ 0.004915987607091665,
529
+ 0.004927278030663729,
530
+ 0.004938581492751837,
531
+ 0.004949898459017277,
532
+ 0.004961227998137474,
533
+ 0.004972570110112429,
534
+ 0.004983925726264715,
535
+ 0.004995293915271759,
536
+ 0.0050066751427948475,
537
+ 0.005018069874495268,
538
+ 0.005029476713389158,
539
+ 0.0050408970564603806,
540
+ 0.00505232997238636,
541
+ 0.005063776392489672,
542
+ 0.0050752353854477406,
543
+ 0.005086707416921854,
544
+ 0.005098192021250725,
545
+ 0.0051096901297569275,
546
+ 0.0051212008111178875,
547
+ 0.005132724530994892,
548
+ 0.005144261289387941,
549
+ 0.005155811086297035,
550
+ 0.005167373921722174,
551
+ 0.0051789493300020695,
552
+ 0.00519053777679801,
553
+ 0.005202139262109995,
554
+ 0.0052137537859380245,
555
+ 0.005225381348282099,
556
+ 0.00523702148348093,
557
+ 0.005248675122857094,
558
+ 0.005260341335088015,
559
+ 0.005272020120173693,
560
+ 0.005283712409436703,
561
+ 0.005295417737215757,
562
+ 0.005307135637849569,
563
+ 0.005318866576999426,
564
+ 0.005330610554665327,
565
+ 0.005342367570847273,
566
+ 0.005354137159883976,
567
+ 0.005365920253098011,
568
+ 0.005377715919166803,
569
+ 0.00538952462375164,
570
+ 0.005401346366852522,
571
+ 0.005413180682808161,
572
+ 0.005425028502941132,
573
+ 0.00543688889592886,
574
+ 0.0054487623274326324,
575
+ 0.00546064879745245,
576
+ 0.005472548305988312,
577
+ 0.005484460387378931,
578
+ 0.005496385972946882,
579
+ 0.005508324131369591,
580
+ 0.005520275328308344,
581
+ 0.005532239098101854,
582
+ 0.005544216372072697,
583
+ 0.005556206218898296,
584
+ 0.005568209104239941,
585
+ 0.0055802250280976295,
586
+ 0.005592253990471363,
587
+ 0.005604295991361141,
588
+ 0.005616350565105677,
589
+ 0.005628418643027544,
590
+ 0.005640499293804169,
591
+ 0.005652592517435551,
592
+ 0.005664699245244265,
593
+ 0.005676819011569023,
594
+ 0.005688951350748539,
595
+ 0.005701096728444099,
596
+ 0.0057132551446557045,
597
+ 0.005725426599383354,
598
+ 0.005737610626965761,
599
+ 0.0057498081587255,
600
+ 0.005762018263339996,
601
+ 0.005774241406470537,
602
+ 0.005786477588117123,
603
+ 0.005798726342618465,
604
+ 0.00581098860129714,
605
+ 0.005823263432830572,
606
+ 0.005835551302880049,
607
+ 0.00584785221144557,
608
+ 0.0058601656928658485,
609
+ 0.005872492678463459,
610
+ 0.005884832236915827,
611
+ 0.005897184833884239,
612
+ 0.005909550469368696,
613
+ 0.005921929143369198,
614
+ 0.005934320390224457,
615
+ 0.005946725141257048,
616
+ 0.005959142465144396,
617
+ 0.005971572827547789,
618
+ 0.005984016228467226,
619
+ 0.005996472202241421,
620
+ 0.006008941680192947,
621
+ 0.006021423730999231,
622
+ 0.00603391882032156,
623
+ 0.006046426948159933,
624
+ 0.006058948114514351,
625
+ 0.006071481853723526,
626
+ 0.006084028631448746,
627
+ 0.00609658844769001,
628
+ 0.006109161302447319,
629
+ 0.006121747195720673,
630
+ 0.006134346127510071,
631
+ 0.006146957632154226,
632
+ 0.006159582175314426,
633
+ 0.006172219756990671,
634
+ 0.0061848703771829605,
635
+ 0.006197533570230007,
636
+ 0.006210210267454386,
637
+ 0.006222899537533522,
638
+ 0.006235601846128702,
639
+ 0.006248317193239927,
640
+ 0.00626104511320591,
641
+ 0.006273786537349224,
642
+ 0.006286540534347296,
643
+ 0.006299307569861412,
644
+ 0.006312087643891573,
645
+ 0.0063248807564377785,
646
+ 0.006337686441838741,
647
+ 0.006350505631417036,
648
+ 0.006363337393850088,
649
+ 0.006376182194799185,
650
+ 0.006389040034264326,
651
+ 0.006401910446584225,
652
+ 0.006414794363081455,
653
+ 0.006427690852433443,
654
+ 0.0064406003803014755,
655
+ 0.006453522946685553,
656
+ 0.006466458085924387,
657
+ 0.006479406729340553,
658
+ 0.006492367945611477,
659
+ 0.006505342200398445,
660
+ 0.006518329493701458,
661
+ 0.0065313298255205154,
662
+ 0.00654434273019433,
663
+ 0.00655736867338419,
664
+ 0.006570408120751381,
665
+ 0.0065834601409733295,
666
+ 0.0065965247340500355,
667
+ 0.006609602831304073,
668
+ 0.0066226935014128685,
669
+ 0.006635797210037708,
670
+ 0.006648913957178593,
671
+ 0.006662043742835522,
672
+ 0.006675186567008495,
673
+ 0.006688341964036226,
674
+ 0.006701510865241289,
675
+ 0.006714692339301109,
676
+ 0.006727886851876974,
677
+ 0.006741093937307596,
678
+ 0.00675431452691555,
679
+ 0.0067675476893782616,
680
+ 0.0067807938903570175,
681
+ 0.006794053129851818,
682
+ 0.006807325407862663,
683
+ 0.006820610258728266,
684
+ 0.0068339086137712,
685
+ 0.006847219541668892,
686
+ 0.006860543508082628,
687
+ 0.006873880513012409,
688
+ 0.0068872300907969475,
689
+ 0.006900593172758818,
690
+ 0.006913968827575445,
691
+ 0.006927357520908117,
692
+ 0.006940759252756834,
693
+ 0.006954174023121595,
694
+ 0.006967601366341114,
695
+ 0.006981041748076677,
696
+ 0.0069944956339895725,
697
+ 0.007007961627095938,
698
+ 0.007021441124379635,
699
+ 0.007034933660179377,
700
+ 0.007048438768833876,
701
+ 0.007061956916004419,
702
+ 0.007075488101691008,
703
+ 0.0070890323258936405,
704
+ 0.007102589588612318,
705
+ 0.007116159424185753,
706
+ 0.007129742298275232,
707
+ 0.007143338210880756,
708
+ 0.007156947162002325,
709
+ 0.007170569151639938,
710
+ 0.007184203714132309,
711
+ 0.0071978517808020115,
712
+ 0.007211512420326471,
713
+ 0.007225186098366976,
714
+ 0.0072388723492622375,
715
+ 0.007252572104334831,
716
+ 0.007266284432262182,
717
+ 0.007280009798705578,
718
+ 0.007293748203665018,
719
+ 0.007307499647140503,
720
+ 0.007321264129132032,
721
+ 0.007335041183978319,
722
+ 0.0073488312773406506,
723
+ 0.0073626344092190266,
724
+ 0.007376450579613447,
725
+ 0.007390279788523912,
726
+ 0.007404121570289135,
727
+ 0.0074179768562316895,
728
+ 0.007431844715029001,
729
+ 0.007445725612342358,
730
+ 0.007459619082510471,
731
+ 0.007473526056855917,
732
+ 0.00748744560405612,
733
+ 0.0075013781897723675,
734
+ 0.00751532381400466,
735
+ 0.0075292824767529964,
736
+ 0.007543254178017378,
737
+ 0.007557238452136517,
738
+ 0.0075712357647717,
739
+ 0.007585246115922928,
740
+ 0.0075992695055902,
741
+ 0.007613305933773518,
742
+ 0.007627354934811592,
743
+ 0.0076414174400269985,
744
+ 0.007655492518097162,
745
+ 0.007669580634683371,
746
+ 0.007683681324124336,
747
+ 0.007697795517742634,
748
+ 0.007711922284215689,
749
+ 0.007726062089204788,
750
+ 0.007740214932709932,
751
+ 0.007754380814731121,
752
+ 0.007768559735268354,
753
+ 0.007782751228660345,
754
+ 0.00779695576056838,
755
+ 0.00781117333099246,
+ 0.007825404405593872,
+ 0.007839647121727467,
+ 0.00785390380769968,
+ 0.007868173532187939,
+ 0.007882455363869667,
+ 0.007896751165390015,
+ 0.007911059074103832,
+ 0.007925380021333694,
+ 0.007939714007079601,
+ 0.007954061031341553,
+ 0.007968421094119549,
+ 0.00798279419541359,
+ 0.007997180335223675,
+ 0.008011579513549805,
+ 0.008025990799069405,
+ 0.008040416054427624,
+ 0.008054853416979313,
+ 0.008069304749369621,
+ 0.0080837681889534,
+ 0.008098244667053223,
+ 0.00811273418366909,
+ 0.008127236738801003,
+ 0.00814175233244896,
+ 0.00815628096461296,
+ 0.008170821703970432,
+ 0.008185376413166523,
+ 0.008199943229556084,
+ 0.008214524015784264,
+ 0.008229116909205914,
+ 0.008243722841143608,
+ 0.008258341811597347,
+ 0.008272973820567131,
+ 0.00828761886805296,
+ 0.008302276954054832,
+ 0.00831694807857275,
+ 0.008331631310284138,
+ 0.008346328511834145,
+ 0.008361037820577621,
+ 0.008375760167837143,
+ 0.008390496484935284,
+ 0.008405244909226894,
+ 0.00842000637203455,
+ 0.00843478087335825,
+ 0.008449568413197994,
+ 0.008464368991553783,
+ 0.008479181677103043,
+ 0.008494008332490921,
+ 0.00850884709507227,
+ 0.008523699827492237,
+ 0.008538564667105675,
+ 0.008553442545235157,
+ 0.008568333461880684,
+ 0.008583237417042255,
+ 0.008598154410719872,
+ 0.008613084442913532,
+ 0.008628027513623238,
+ 0.008642982691526413,
+ 0.008657951839268208,
+ 0.008672934025526047,
+ 0.008687928318977356,
+ 0.00870293565094471,
+ 0.008717956021428108,
+ 0.008732989430427551,
+ 0.008748035877943039,
+ 0.008763095363974571,
+ 0.008778167888522148,
+ 0.00879325345158577,
+ 0.008808351121842861,
+ 0.008823462761938572,
+ 0.008838586509227753,
+ 0.008853724226355553,
+ 0.008868874050676823,
+ 0.008884036913514137,
+ 0.008899212814867496,
+ 0.0089144017547369,
+ 0.008929603733122349,
+ 0.008944818750023842,
+ 0.008960045874118805,
+ 0.008975286968052387,
+ 0.00899054016917944,
+ 0.009005807340145111,
+ 0.009021086618304253,
+ 0.009036378934979439,
+ 0.00905168429017067,
+ 0.009067002683877945,
+ 0.009082334116101265,
+ 0.00909767858684063,
+ 0.009113036096096039,
+ 0.009128405712544918,
+ 0.009143789298832417,
+ 0.009159184992313385,
+ 0.009174594655632973,
+ 0.00919001642614603,
+ 0.009205451235175133,
+ 0.00922089908272028,
+ 0.009236359968781471,
+ 0.009251833893358707,
+ 0.009267320856451988,
+ 0.009282820858061314,
+ 0.009298332966864109,
+ 0.009313859045505524,
+ 0.009329397231340408,
+ 0.009344948455691338,
+ 0.009360513649880886,
+ 0.009376090951263905,
+ 0.009391681291162968,
+ 0.009407284669578075,
+ 0.009422901086509228,
+ 0.00943852961063385,
+ 0.009454172104597092,
+ 0.009469827637076378,
+ 0.009485495276749134,
+ 0.009501175954937935,
+ 0.009516870602965355,
+ 0.009532577358186245,
+ 0.00954829715192318,
+ 0.009564029984176159,
+ 0.009579775854945183,
+ 0.009595534764230251,
+ 0.009611306712031364,
+ 0.009627090767025948,
+ 0.00964288879185915,
+ 0.009658698923885822,
+ 0.00967452209442854,
+ 0.009690359234809875,
+ 0.009706208482384682,
+ 0.009722070768475533,
+ 0.009737946093082428,
+ 0.009753834456205368,
+ 0.009769735857844353,
+ 0.009785649366676807,
+ 0.009801576845347881,
+ 0.009817516431212425,
+ 0.009833469986915588,
+ 0.009849435649812222,
+ 0.0098654143512249,
+ 0.009881406091153622,
+ 0.009897410869598389,
+ 0.0099134286865592,
+ 0.009929459542036057,
+ 0.009945503436028957,
+ 0.009961560368537903,
+ 0.009977629408240318,
+ 0.009993712417781353,
+ 0.010009807534515858,
+ 0.010025915689766407,
+ 0.010042036883533001,
+ 0.01005817111581564,
+ 0.010074318386614323,
+ 0.01009047869592905,
+ 0.010106652043759823,
+ 0.01012283843010664,
+ 0.010139036923646927,
+ 0.010155249387025833,
+ 0.01017147395759821,
+ 0.010187712498009205,
+ 0.01020396314561367,
+ 0.01022022683173418,
+ 0.010236503556370735,
+ 0.010252793319523335,
+ 0.010269096121191978,
+ 0.010285411961376667,
+ 0.010301739908754826,
+ 0.010318081825971603,
+ 0.010334436781704426,
+ 0.010350803844630718,
+ 0.010367183946073055,
+ 0.010383577086031437,
+ 0.010399984195828438,
+ 0.010416403412818909,
+ 0.01043283473700285,
+ 0.01044928003102541,
+ 0.010465738363564014,
+ 0.010482209734618664,
+ 0.010498693212866783,
+ 0.010515190660953522,
+ 0.01053170021623373,
+ 0.010548222810029984,
+ 0.010564758442342281,
+ 0.010581308044493198,
+ 0.010597869753837585,
+ 0.010614443570375443,
+ 0.010631031356751919,
+ 0.01064763218164444,
+ 0.01066424511373043,
+ 0.01068087201565504,
+ 0.010697511024773121,
+ 0.01071416400372982,
+ 0.01073082908987999,
+ 0.010747507214546204,
+ 0.010764198377728462,
+ 0.010780902579426765,
+ 0.010797619819641113,
+ 0.010814350098371506,
+ 0.010831092484295368,
+ 0.01084784884005785,
+ 0.010864617303013802,
+ 0.010881399735808372,
+ 0.010898194275796413,
+ 0.010915001854300499,
+ 0.010931822471320629,
+ 0.010948656126856804,
+ 0.010965502820909023,
+ 0.010982362553477287,
+ 0.010999235324561596,
+ 0.011016120202839375,
+ 0.011033019050955772,
+ 0.01104993000626564,
+ 0.011066854931414127,
+ 0.011083791963756084,
+ 0.011100742034614086,
+ 0.011117705143988132,
+ 0.011134681291878223,
+ 0.011151670478284359,
+ 0.01116867270320654,
+ 0.011185687966644764,
+ 0.011202715337276459,
+ 0.011219756677746773,
+ 0.011236810125410557,
+ 0.011253876611590385,
+ 0.011270957067608833,
+ 0.011288049630820751,
+ 0.011305155232548714,
+ 0.01132227387279272,
+ 0.011339405551552773,
+ 0.011356549337506294,
+ 0.011373707093298435,
+ 0.01139087788760662,
+ 0.011408060789108276,
+ 0.011425256729125977,
+ 0.011442466638982296,
+ 0.011459688656032085,
+ 0.01147692371159792,
+ 0.011494171805679798,
+ 0.011511432938277721,
+ 0.01152870710939169,
+ 0.011545993387699127,
+ 0.011563293635845184,
+ 0.011580605991184711,
+ 0.011597932316362858,
+ 0.011615270748734474,
+ 0.01163262315094471,
+ 0.011649987660348415,
+ 0.011667365208268166,
+ 0.01168475579470396,
+ 0.0117021594196558,
+ 0.01171957515180111,
+ 0.011737004853785038,
+ 0.011754447594285011,
+ 0.011771902441978455,
+ 0.011789370328187943,
+ 0.01180685218423605,
+ 0.011824346147477627,
+ 0.011841853149235249,
+ 0.011859373189508915,
+ 0.011876906268298626,
+ 0.011894452385604382,
+ 0.011912011541426182,
+ 0.011929582804441452,
+ 0.011947168037295341,
+ 0.011964765377342701,
+ 0.01198237668722868,
+ 0.012000000104308128
+ ]
+ }
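The values above are the ascending tail of the scheduler's beta schedule, ending at 0.012. As a minimal sketch (plain Python, with a few values copied from the listing above; no diffusers dependency), a DDPM-style scheduler checks that betas increase monotonically and derives the cumulative alpha products from `alpha_t = 1 - beta_t`:

```python
# A few values copied from the beta schedule listed above (ascending tail).
betas = [
    0.007825404405593872,
    0.008302276954054832,
    0.009005807340145111,
    0.010009807534515858,
    0.011016120202839375,
    0.012000000104308128,
]

# The schedule increases monotonically toward its final value, 0.012.
assert all(b2 > b1 for b1, b2 in zip(betas, betas[1:]))

# A DDPM-style scheduler turns betas into cumulative alpha products:
# alpha_t = 1 - beta_t, alphas_cumprod[t] = prod(alpha_0 .. alpha_t).
alphas_cumprod = []
prod = 1.0
for b in betas:
    prod *= 1.0 - b
    alphas_cumprod.append(prod)
```

The cumulative product decreases as noise accumulates, which is what the sampler inverts at generation time.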
tokenizer/special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+ "cls_token": "[CLS]",
+ "mask_token": "[MASK]",
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "unk_token": "[UNK]"
+ }
tokenizer/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer/tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "101": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "102": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "103": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "[CLS]",
+ "do_lower_case": true,
+ "mask_token": "[MASK]",
+ "model_max_length": 512,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "strip_accents": null,
+ "tokenize_chinese_chars": true,
+ "tokenizer_class": "BertTokenizer",
+ "unk_token": "[UNK]"
+ }
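The `added_tokens_decoder` above pins the five BERT special tokens to fixed ids in the uncased BERT vocabulary. A minimal sketch (plain Python, no `transformers` dependency; `decode_specials` is a hypothetical helper, not part of this repo) of the id-to-token mapping this config encodes:

```python
# Special-token ids as declared in added_tokens_decoder above
# (standard bert-base-uncased vocabulary positions).
special_tokens = {
    0: "[PAD]",
    100: "[UNK]",
    101: "[CLS]",
    102: "[SEP]",
    103: "[MASK]",
}

def decode_specials(ids, id_to_token=special_tokens):
    """Map token ids back to special-token strings where applicable."""
    return [id_to_token.get(i, "<wordpiece>") for i in ids]

# A BERT-style encoded prompt is wrapped as [CLS] ... [SEP] and padded:
print(decode_specials([101, 7592, 102, 0]))
# -> ['[CLS]', '<wordpiece>', '[SEP]', '[PAD]']
```

This is the text-encoder tokenizer the pipeline loads automatically; you normally never construct this mapping by hand.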
tokenizer/vocab.txt ADDED
The diff for this file is too large to render. See raw diff
 
unet/config.json ADDED
@@ -0,0 +1,72 @@
+ {
+ "_class_name": "UNet2DConditionModel",
+ "_diffusers_version": "0.27.0.dev0",
+ "act_fn": "silu",
+ "addition_embed_type": null,
+ "addition_embed_type_num_heads": 64,
+ "addition_time_embed_dim": null,
+ "attention_head_dim": [
+ 6,
+ 12,
+ 18,
+ 30
+ ],
+ "attention_type": "default",
+ "block_out_channels": [
+ 192,
+ 384,
+ 576,
+ 960
+ ],
+ "center_input_sample": false,
+ "class_embed_type": null,
+ "class_embeddings_concat": false,
+ "conv_in_kernel": 3,
+ "conv_out_kernel": 3,
+ "cross_attention_dim": 768,
+ "cross_attention_norm": null,
+ "down_block_types": [
+ "DownBlock2D",
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D"
+ ],
+ "downsample_padding": 1,
+ "dropout": 0.0,
+ "dual_cross_attention": false,
+ "encoder_hid_dim": null,
+ "encoder_hid_dim_type": null,
+ "flip_sin_to_cos": true,
+ "freq_shift": 0,
+ "in_channels": 3,
+ "layers_per_block": 2,
+ "mid_block_only_cross_attention": null,
+ "mid_block_scale_factor": 1,
+ "mid_block_type": "UNetMidBlock2DCrossAttn",
+ "norm_eps": 1e-05,
+ "norm_num_groups": 32,
+ "num_attention_heads": null,
+ "num_class_embeds": null,
+ "only_cross_attention": false,
+ "out_channels": 3,
+ "projection_class_embeddings_input_dim": null,
+ "resnet_out_scale_factor": 1.0,
+ "resnet_skip_time_act": false,
+ "resnet_time_scale_shift": "default",
+ "reverse_transformer_layers_per_block": null,
+ "sample_size": 64,
+ "time_cond_proj_dim": null,
+ "time_embedding_act_fn": null,
+ "time_embedding_dim": null,
+ "time_embedding_type": "positional",
+ "timestep_post_act": null,
+ "transformer_layers_per_block": 1,
+ "up_block_types": [
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D",
+ "UpBlock2D"
+ ],
+ "upcast_attention": false,
+ "use_linear_projection": false
+ }
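One detail worth noting in the UNet config above: `num_attention_heads` is null, in which case diffusers interprets the per-block `attention_head_dim` values (a historical naming quirk of `UNet2DConditionModel`) as the number of attention heads per block. Under that reading, the head width is constant across all four resolutions, a quick sanity check in plain Python:

```python
# Values from the unet/config.json above.
block_out_channels = [192, 384, 576, 960]
attention_head_dim = [6, 12, 18, 30]

# Read as per-block head counts, each block ends up with the same
# per-head channel width (channels / heads):
per_head_width = [c // h for c, h in zip(block_out_channels, attention_head_dim)]
print(per_head_width)
# -> [32, 32, 32, 32]
```

Together with `in_channels: 3` and `sample_size: 64`, this UNet denoises 64x64x3 latents conditioned on 768-dimensional text embeddings (`cross_attention_dim`).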
unet/diffusion_pytorch_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:585b29b19df7067708a1ef427e18f556932bec1476ed3090c94aab132a31e598
+ size 1625392420
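The three lines above are a Git LFS pointer, not the weights themselves: the actual safetensors file is stored out-of-band and addressed by its SHA-256 digest. A small sketch of parsing such a pointer (key-value lines separated by a single space):

```python
# The LFS pointer shown above, verbatim.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:585b29b19df7067708a1ef427e18f556932bec1476ed3090c94aab132a31e598
size 1625392420"""

# Each line is "<key> <value>"; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.splitlines())
size_gb = int(fields["size"]) / 1e9  # roughly 1.6 GB of UNet weights
```

Cloning with `git lfs` (or loading via `from_pretrained`) resolves the pointer to the real file automatically.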
vqvae/config.json ADDED
@@ -0,0 +1,33 @@
+ {
+ "_class_name": "VQModel",
+ "_diffusers_version": "0.27.0.dev0",
+ "act_fn": "silu",
+ "block_out_channels": [
+ 128,
+ 256,
+ 512
+ ],
+ "down_block_types": [
+ "DownEncoderBlock2D",
+ "DownEncoderBlock2D",
+ "DownEncoderBlock2D"
+ ],
+ "force_upcast": false,
+ "in_channels": 3,
+ "latent_channels": 3,
+ "layers_per_block": 2,
+ "lookup_from_codebook": false,
+ "mid_block_add_attention": true,
+ "norm_num_groups": 32,
+ "norm_type": "group",
+ "num_vq_embeddings": 8192,
+ "out_channels": 3,
+ "sample_size": 64,
+ "scaling_factor": 1.0,
+ "up_block_types": [
+ "UpDecoderBlock2D",
+ "UpDecoderBlock2D",
+ "UpDecoderBlock2D"
+ ],
+ "vq_embed_dim": null
+ }
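With three encoder stages, of which every stage but the last downsamples, the VQ model above compresses images spatially by a factor of 4. Assuming a 256x256 training resolution (an assumption about the LDM setup, not stated in this config), that yields 64x64x3 latents, which matches the UNet's `sample_size: 64` and `in_channels: 3`:

```python
# From the vqvae/config.json above.
block_out_channels = [128, 256, 512]
latent_channels = 3

image_size = 256  # assumed pixel resolution; not stated in the config itself

# Every encoder stage except the last halves the spatial resolution:
downsample_factor = 2 ** (len(block_out_channels) - 1)
latent_size = image_size // downsample_factor
print(downsample_factor, latent_size)
# -> 4 64
```

The 8192-entry codebook (`num_vq_embeddings`) quantizes those 3-channel latents; the diffusion model then operates entirely in this compressed space.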
vqvae/diffusion_pytorch_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:178b7f07d6d5665795e45941c23e3248a7b9e1b8155efbcb0e56d4985fc62963
+ size 221314016