Commit 4ac8f3e · verified · committed by ShadeEngine · 1 Parent(s): 6ef83e0

End of training

This view is limited to 50 files because the commit contains too many changes. See the raw diff for the full change set.
Files changed (50)
  1. .config/.last_opt_in_prompt.yaml +1 -0
  2. .config/.last_survey_prompt.yaml +1 -0
  3. .config/.last_update_check.json +1 -0
  4. .config/active_config +1 -0
  5. .config/config_sentinel +0 -0
  6. .config/configurations/config_default +6 -0
  7. .config/default_configs.db +0 -0
  8. .config/gce +1 -0
  9. .config/logs/2024.07.22/13.20.29.141190.log +764 -0
  10. .config/logs/2024.07.22/13.20.52.312699.log +5 -0
  11. .config/logs/2024.07.22/13.21.03.941407.log +123 -0
  12. .config/logs/2024.07.22/13.21.05.029456.log +5 -0
  13. .config/logs/2024.07.22/13.21.16.643722.log +8 -0
  14. .config/logs/2024.07.22/13.21.17.317802.log +8 -0
  15. .gitattributes +3 -0
  16. README.md +43 -0
  17. diffusers/.github/ISSUE_TEMPLATE/bug-report.yml +110 -0
  18. diffusers/.github/ISSUE_TEMPLATE/config.yml +4 -0
  19. diffusers/.github/ISSUE_TEMPLATE/feature_request.md +20 -0
  20. diffusers/.github/ISSUE_TEMPLATE/feedback.md +12 -0
  21. diffusers/.github/ISSUE_TEMPLATE/new-model-addition.yml +31 -0
  22. diffusers/.github/ISSUE_TEMPLATE/translate.md +29 -0
  23. diffusers/.github/PULL_REQUEST_TEMPLATE.md +61 -0
  24. diffusers/.github/actions/setup-miniconda/action.yml +146 -0
  25. diffusers/.github/workflows/benchmark.yml +66 -0
  26. diffusers/.github/workflows/build_docker_images.yml +101 -0
  27. diffusers/.github/workflows/build_documentation.yml +27 -0
  28. diffusers/.github/workflows/build_pr_documentation.yml +23 -0
  29. diffusers/.github/workflows/mirror_community_pipeline.yml +102 -0
  30. diffusers/.github/workflows/nightly_tests.yml +414 -0
  31. diffusers/.github/workflows/notify_slack_about_release.yml +23 -0
  32. diffusers/.github/workflows/pr_dependency_test.yml +35 -0
  33. diffusers/.github/workflows/pr_flax_dependency_test.yml +38 -0
  34. diffusers/.github/workflows/pr_test_fetcher.yml +174 -0
  35. diffusers/.github/workflows/pr_test_peft_backend.yml +131 -0
  36. diffusers/.github/workflows/pr_tests.yml +233 -0
  37. diffusers/.github/workflows/pr_torch_dependency_test.yml +36 -0
  38. diffusers/.github/workflows/push_tests.yml +436 -0
  39. diffusers/.github/workflows/push_tests_fast.yml +124 -0
  40. diffusers/.github/workflows/push_tests_mps.yml +75 -0
  41. diffusers/.github/workflows/pypi_publish.yaml +81 -0
  42. diffusers/.github/workflows/run_tests_from_a_pr.yml +73 -0
  43. diffusers/.github/workflows/ssh-pr-runner.yml +39 -0
  44. diffusers/.github/workflows/ssh-runner.yml +46 -0
  45. diffusers/.github/workflows/stale.yml +27 -0
  46. diffusers/.github/workflows/trufflehog.yml +15 -0
  47. diffusers/.github/workflows/typos.yml +14 -0
  48. diffusers/.github/workflows/update_metadata.yml +30 -0
  49. diffusers/.github/workflows/upload_pr_documentation.yml +16 -0
  50. diffusers/.gitignore +178 -0
.config/.last_opt_in_prompt.yaml ADDED
@@ -0,0 +1 @@
+ {}
.config/.last_survey_prompt.yaml ADDED
@@ -0,0 +1 @@
+ last_prompt_time: 1721654463.344073
.config/.last_update_check.json ADDED
@@ -0,0 +1 @@
+ {"last_update_check_time": 1721654464.5230906, "last_update_check_revision": 20240712142834, "notifications": [], "last_nag_times": {}}
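The update-check state above is plain JSON whose `last_update_check_time` is a Unix epoch timestamp and whose `last_update_check_revision` encodes a build date as `YYYYMMDDHHMMSS`. A minimal Python sketch decoding both (the literal below is copied from the diff; the field interpretation is an assumption based on the values):

```python
import json
from datetime import datetime, timezone

RAW = ('{"last_update_check_time": 1721654464.5230906, '
       '"last_update_check_revision": 20240712142834, '
       '"notifications": [], "last_nag_times": {}}')

state = json.loads(RAW)

# Epoch seconds -> aware UTC datetime; this lands on 2024-07-22 13:21 UTC,
# which matches the log timestamps elsewhere in this commit.
checked_at = datetime.fromtimestamp(state["last_update_check_time"], tz=timezone.utc)
print(checked_at)

# Revision looks like a packed build timestamp: YYYYMMDDHHMMSS.
rev = datetime.strptime(str(state["last_update_check_revision"]), "%Y%m%d%H%M%S")
print(rev)  # 2024-07-12 14:28:34
```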
.config/active_config ADDED
@@ -0,0 +1 @@
+ default
.config/config_sentinel ADDED
File without changes
.config/configurations/config_default ADDED
@@ -0,0 +1,6 @@
+ [component_manager]
+ disable_update_check = true
+
+ [compute]
+ gce_metadata_read_timeout_sec = 0
+
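The `config_default` file added above uses INI-style sections, so it can be read with Python's standard `configparser`. A small sketch, with the file contents inlined from the diff rather than read from disk:

```python
import configparser

# Contents of .config/configurations/config_default as shown in the diff.
CONFIG_TEXT = """\
[component_manager]
disable_update_check = true

[compute]
gce_metadata_read_timeout_sec = 0
"""

config = configparser.ConfigParser()
config.read_string(CONFIG_TEXT)

# configparser coerces "true"/"false" strings via getboolean, numbers via getint.
print(config.getboolean("component_manager", "disable_update_check"))  # True
print(config.getint("compute", "gce_metadata_read_timeout_sec"))       # 0
```

The two settings disable the gcloud component update check and skip waiting on GCE metadata, which is typical for non-interactive container environments.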
.config/default_configs.db ADDED
Binary file (12.3 kB)
.config/gce ADDED
@@ -0,0 +1 @@
+ False
.config/logs/2024.07.22/13.20.29.141190.log ADDED
@@ -0,0 +1,764 @@
1
+ 2024-07-22 13:20:41,168 DEBUG root Loaded Command Group: ['gcloud', 'components']
2
+ 2024-07-22 13:20:41,172 DEBUG root Loaded Command Group: ['gcloud', 'components', 'update']
3
+ 2024-07-22 13:20:41,175 DEBUG root Running [gcloud.components.update] with arguments: [--allow-no-backup: "True", --compile-python: "True", --quiet: "True", COMPONENT-IDS:6: "['core', 'gcloud-deps', 'bq', 'gcloud', 'gcloud-crc32c', 'gsutil']"]
4
+ 2024-07-22 13:20:41,176 INFO ___FILE_ONLY___ Beginning update. This process may take several minutes.
5
+
6
+ 2024-07-22 13:20:41,211 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
7
+ 2024-07-22 13:20:41,286 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components-2.json HTTP/1.1" 200 223375
8
+ 2024-07-22 13:20:41,308 INFO ___FILE_ONLY___
9
+
10
+ 2024-07-22 13:20:41,309 INFO ___FILE_ONLY___
11
+ Your current Google Cloud CLI version is: 484.0.0
12
+
13
+ 2024-07-22 13:20:41,309 INFO ___FILE_ONLY___ Installing components from version: 484.0.0
14
+
15
+ 2024-07-22 13:20:41,309 INFO ___FILE_ONLY___
16
+
17
+ 2024-07-22 13:20:41,309 DEBUG root Chosen display Format:table[box,title="These components will be removed."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
18
+ 2024-07-22 13:20:41,310 DEBUG root Chosen display Format:table[box,title="These components will be updated."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
19
+ 2024-07-22 13:20:41,311 DEBUG root Chosen display Format:table[box,title="These components will be installed."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
20
+ 2024-07-22 13:20:41,457 INFO ___FILE_ONLY___ ┌─────────────────────────────────────────────────────────────────────────────┐
21
+ 2024-07-22 13:20:41,457 INFO ___FILE_ONLY___
22
+
23
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ │ These components will be installed. │
24
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___
25
+
26
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ ├─────────────────────────────────────────────────────┬────────────┬──────────┤
27
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___
28
+
29
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ │ Name │ Version │ Size │
30
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___
31
+
32
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ ├─────────────────────────────────────────────────────┼────────────┼──────────┤
33
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___
34
+
35
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ │
36
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ BigQuery Command Line Tool
37
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___
38
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ │
39
+ 2024-07-22 13:20:41,458 INFO ___FILE_ONLY___ 2.1.7
40
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___
41
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ │
42
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ 1.7 MiB
43
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___
44
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ │
45
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___
46
+
47
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ │
48
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ BigQuery Command Line Tool (Platform Specific)
49
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___
50
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ │
51
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ 2.0.101
52
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___
53
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ │
54
+ 2024-07-22 13:20:41,459 INFO ___FILE_ONLY___ < 1 MiB
55
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___
56
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ │
57
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___
58
+
59
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ │
60
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ Bundled Python 3.11 (Platform Specific)
61
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___
62
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ │
63
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ 3.11.8
64
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___
65
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ │
66
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ 74.0 MiB
67
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___
68
+ 2024-07-22 13:20:41,460 INFO ___FILE_ONLY___ │
69
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___
70
+
71
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ │
72
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ Cloud Storage Command Line Tool
73
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___
74
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ │
75
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ 5.30
76
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___
77
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ │
78
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ 11.3 MiB
79
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___
80
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ │
81
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___
82
+
83
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ │
84
+ 2024-07-22 13:20:41,461 INFO ___FILE_ONLY___ Cloud Storage Command Line Tool (Platform Specific)
85
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
86
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ │
87
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ 5.27
88
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
89
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ │
90
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ < 1 MiB
91
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
92
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ │
93
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
94
+
95
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ │
96
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ Google Cloud CLI Core Libraries (Platform Specific)
97
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
98
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ │
99
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___ 2024.01.06
100
+ 2024-07-22 13:20:41,462 INFO ___FILE_ONLY___
101
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ │
102
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ < 1 MiB
103
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___
104
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ │
105
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___
106
+
107
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ │
108
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ Google Cloud CRC32C Hash Tool (Platform Specific)
109
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___
110
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ │
111
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ 1.0.0
112
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___
113
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ │
114
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___ 1.3 MiB
115
+ 2024-07-22 13:20:41,463 INFO ___FILE_ONLY___
116
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ │
117
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___
118
+
119
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ │
120
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ gcloud cli dependencies (Platform Specific)
121
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___
122
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ │
123
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ 2021.04.16
124
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___
125
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ │
126
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ < 1 MiB
127
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___
128
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ │
129
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___
130
+
131
+ 2024-07-22 13:20:41,464 INFO ___FILE_ONLY___ └─────────────────────────────────────────────────────┴────────────┴──────────┘
132
+ 2024-07-22 13:20:41,465 INFO ___FILE_ONLY___
133
+
134
+ 2024-07-22 13:20:41,465 INFO ___FILE_ONLY___
135
+
136
+ 2024-07-22 13:20:41,469 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
137
+ 2024-07-22 13:20:41,546 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/RELEASE_NOTES HTTP/1.1" 200 1243506
138
+ 2024-07-22 13:20:41,669 INFO ___FILE_ONLY___ For the latest full release notes, please visit:
139
+ https://cloud.google.com/sdk/release_notes
140
+
141
+
142
+ 2024-07-22 13:20:41,669 INFO ___FILE_ONLY___ Performing in place update...
143
+
144
+
145
+ 2024-07-22 13:20:41,671 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
146
+
147
+ 2024-07-22 13:20:41,671 INFO ___FILE_ONLY___ ╠═ Downloading: BigQuery Command Line Tool ═╣
148
+
149
+ 2024-07-22 13:20:41,672 INFO ___FILE_ONLY___ ╚
150
+ 2024-07-22 13:20:41,677 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
151
+ 2024-07-22 13:20:41,755 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bq-20240712142834.tar.gz HTTP/1.1" 200 1808321
152
+ 2024-07-22 13:20:41,831 INFO ___FILE_ONLY___ ═
153
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
154
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
155
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
156
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
157
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
158
+ 2024-07-22 13:20:41,832 INFO ___FILE_ONLY___ ═
159
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
160
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
161
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
162
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
163
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
164
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
165
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
166
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
167
+ 2024-07-22 13:20:41,833 INFO ___FILE_ONLY___ ═
168
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
169
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
170
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
171
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
172
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
173
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
174
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
175
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
176
+ 2024-07-22 13:20:41,834 INFO ___FILE_ONLY___ ═
177
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
178
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
179
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
180
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
181
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
182
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
183
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
184
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
185
+ 2024-07-22 13:20:41,835 INFO ___FILE_ONLY___ ═
186
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
187
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
188
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
189
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
190
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
191
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
192
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
193
+ 2024-07-22 13:20:41,836 INFO ___FILE_ONLY___ ═
194
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
195
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
196
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
197
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
198
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
199
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
200
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
201
+ 2024-07-22 13:20:41,837 INFO ___FILE_ONLY___ ═
202
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
203
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
204
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
205
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
206
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
207
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
208
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
209
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
210
+ 2024-07-22 13:20:41,838 INFO ___FILE_ONLY___ ═
211
+ 2024-07-22 13:20:41,839 INFO ___FILE_ONLY___ ═
212
+ 2024-07-22 13:20:41,839 INFO ___FILE_ONLY___ ╝
213
+
214
+ 2024-07-22 13:20:41,841 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
215
+
216
+ 2024-07-22 13:20:41,841 INFO ___FILE_ONLY___ ╠═ Downloading: BigQuery Command Line Tool (Platform Spe... ═╣
217
+
218
+ 2024-07-22 13:20:41,841 INFO ___FILE_ONLY___ ╚
219
+ 2024-07-22 13:20:41,845 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
220
+ 2024-07-22 13:20:41,920 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bq-nix-20240106004423.tar.gz HTTP/1.1" 200 2026
221
+ 2024-07-22 13:20:41,920 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
222
+ 2024-07-22 13:20:41,921 INFO ___FILE_ONLY___ ╝
223
+
224
+ 2024-07-22 13:20:41,923 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
225
+
226
+ 2024-07-22 13:20:41,923 INFO ___FILE_ONLY___ ╠═ Downloading: Bundled Python 3.11 ═╣
227
+
228
+ 2024-07-22 13:20:41,923 INFO ___FILE_ONLY___ ╚
229
+ 2024-07-22 13:20:41,923 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
230
+ 2024-07-22 13:20:41,924 INFO ___FILE_ONLY___ ╝
231
+
232
+ 2024-07-22 13:20:41,925 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
233
+
234
+ 2024-07-22 13:20:41,926 INFO ___FILE_ONLY___ ╠═ Downloading: Bundled Python 3.11 (Platform Specific) ═╣
235
+
236
+ 2024-07-22 13:20:41,926 INFO ___FILE_ONLY___ ╚
237
+ 2024-07-22 13:20:41,930 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
238
+ 2024-07-22 13:20:42,009 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bundled-python3-unix-linux-x86_64-20240712142834.tar.gz HTTP/1.1" 200 77646096
239
+ 2024-07-22 13:20:43,322 INFO ___FILE_ONLY___ ═
240
+ 2024-07-22 13:20:43,324 INFO ___FILE_ONLY___ ═
241
+ 2024-07-22 13:20:43,326 INFO ___FILE_ONLY___ ═
242
+ 2024-07-22 13:20:43,328 INFO ___FILE_ONLY___ ═
243
+ 2024-07-22 13:20:43,329 INFO ___FILE_ONLY___ ═
244
+ 2024-07-22 13:20:43,331 INFO ___FILE_ONLY___ ═
245
+ 2024-07-22 13:20:43,333 INFO ___FILE_ONLY___ ═
246
+ 2024-07-22 13:20:43,334 INFO ___FILE_ONLY___ ═
247
+ 2024-07-22 13:20:43,336 INFO ___FILE_ONLY___ ═
248
+ 2024-07-22 13:20:43,338 INFO ___FILE_ONLY___ ═
249
+ 2024-07-22 13:20:43,340 INFO ___FILE_ONLY___ ═
250
+ 2024-07-22 13:20:43,342 INFO ___FILE_ONLY___ ═
251
+ 2024-07-22 13:20:43,343 INFO ___FILE_ONLY___ ═
252
+ 2024-07-22 13:20:43,345 INFO ___FILE_ONLY___ ═
253
+ 2024-07-22 13:20:43,347 INFO ___FILE_ONLY___ ═
254
+ 2024-07-22 13:20:43,348 INFO ___FILE_ONLY___ ═
255
+ 2024-07-22 13:20:43,350 INFO ___FILE_ONLY___ ═
256
+ 2024-07-22 13:20:43,352 INFO ___FILE_ONLY___ ═
257
+ 2024-07-22 13:20:43,354 INFO ___FILE_ONLY___ ═
258
+ 2024-07-22 13:20:43,355 INFO ___FILE_ONLY___ ═
259
+ 2024-07-22 13:20:43,357 INFO ___FILE_ONLY___ ═
260
+ 2024-07-22 13:20:43,359 INFO ___FILE_ONLY___ ═
261
+ 2024-07-22 13:20:43,360 INFO ___FILE_ONLY___ ═
262
+ 2024-07-22 13:20:43,362 INFO ___FILE_ONLY___ ═
263
+ 2024-07-22 13:20:43,364 INFO ___FILE_ONLY___ ═
264
+ 2024-07-22 13:20:43,365 INFO ___FILE_ONLY___ ═
265
+ 2024-07-22 13:20:43,367 INFO ___FILE_ONLY___ ═
266
+ 2024-07-22 13:20:43,369 INFO ___FILE_ONLY___ ═
267
+ 2024-07-22 13:20:43,371 INFO ___FILE_ONLY___ ═
268
+ 2024-07-22 13:20:43,372 INFO ___FILE_ONLY___ ═
269
+ 2024-07-22 13:20:43,374 INFO ___FILE_ONLY___ ═
270
+ 2024-07-22 13:20:43,376 INFO ___FILE_ONLY___ ═
271
+ 2024-07-22 13:20:43,377 INFO ___FILE_ONLY___ ═
272
+ 2024-07-22 13:20:43,379 INFO ___FILE_ONLY___ ═
273
+ 2024-07-22 13:20:43,381 INFO ___FILE_ONLY___ ═
274
+ 2024-07-22 13:20:43,382 INFO ___FILE_ONLY___ ═
275
+ 2024-07-22 13:20:43,384 INFO ___FILE_ONLY___ ═
276
+ 2024-07-22 13:20:43,386 INFO ___FILE_ONLY___ ═
277
+ 2024-07-22 13:20:43,388 INFO ___FILE_ONLY___ ═
278
+ 2024-07-22 13:20:43,389 INFO ___FILE_ONLY___ ═
279
+ 2024-07-22 13:20:43,391 INFO ___FILE_ONLY___ ═
280
+ 2024-07-22 13:20:43,393 INFO ___FILE_ONLY___ ═
281
+ 2024-07-22 13:20:43,394 INFO ___FILE_ONLY___ ═
282
+ 2024-07-22 13:20:43,396 INFO ___FILE_ONLY___ ═
283
+ 2024-07-22 13:20:43,398 INFO ___FILE_ONLY___ ═
284
+ 2024-07-22 13:20:43,399 INFO ___FILE_ONLY___ ═
285
+ 2024-07-22 13:20:43,401 INFO ___FILE_ONLY___ ═
286
+ 2024-07-22 13:20:43,403 INFO ___FILE_ONLY___ ═
287
+ 2024-07-22 13:20:43,405 INFO ___FILE_ONLY___ ═
288
+ 2024-07-22 13:20:43,406 INFO ___FILE_ONLY___ ═
289
+ 2024-07-22 13:20:43,408 INFO ___FILE_ONLY___ ═
290
+ 2024-07-22 13:20:43,410 INFO ___FILE_ONLY___ ═
291
+ 2024-07-22 13:20:43,411 INFO ___FILE_ONLY___ ═
292
+ 2024-07-22 13:20:43,413 INFO ___FILE_ONLY___ ═
293
+ 2024-07-22 13:20:43,415 INFO ___FILE_ONLY___ ═
294
+ 2024-07-22 13:20:43,417 INFO ___FILE_ONLY___ ═
295
+ 2024-07-22 13:20:43,418 INFO ___FILE_ONLY___ ═
296
+ 2024-07-22 13:20:43,420 INFO ___FILE_ONLY___ ═
297
+ 2024-07-22 13:20:43,422 INFO ___FILE_ONLY___ ═
298
+ 2024-07-22 13:20:43,424 INFO ___FILE_ONLY___ ═
299
+ 2024-07-22 13:20:43,424 INFO ___FILE_ONLY___ ╝
300
+
301
+ 2024-07-22 13:20:43,427 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
302
+
303
+ 2024-07-22 13:20:43,427 INFO ___FILE_ONLY___ ╠═ Downloading: Cloud Storage Command Line Tool ═╣
304
+
305
+ 2024-07-22 13:20:43,427 INFO ___FILE_ONLY___ ╚
306
+ 2024-07-22 13:20:43,431 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
307
+ 2024-07-22 13:20:43,514 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gsutil-20240614142823.tar.gz HTTP/1.1" 200 11883175
308
+ 2024-07-22 13:20:43,694 INFO ___FILE_ONLY___ ═
309
+ 2024-07-22 13:20:43,694 INFO ___FILE_ONLY___ ═
310
+ 2024-07-22 13:20:43,695 INFO ___FILE_ONLY___ ═
311
+ 2024-07-22 13:20:43,695 INFO ___FILE_ONLY___ ═
312
+ 2024-07-22 13:20:43,695 INFO ___FILE_ONLY___ ═
313
+ 2024-07-22 13:20:43,696 INFO ___FILE_ONLY___ ═
314
+ 2024-07-22 13:20:43,696 INFO ___FILE_ONLY___ ═
315
+ 2024-07-22 13:20:43,696 INFO ___FILE_ONLY___ ═
316
+ 2024-07-22 13:20:43,697 INFO ___FILE_ONLY___ ═
317
+ 2024-07-22 13:20:43,697 INFO ___FILE_ONLY___ ═
318
+ 2024-07-22 13:20:43,697 INFO ___FILE_ONLY___ ═
319
+ 2024-07-22 13:20:43,698 INFO ___FILE_ONLY___ ═
320
+ 2024-07-22 13:20:43,698 INFO ___FILE_ONLY___ ═
321
+ 2024-07-22 13:20:43,698 INFO ___FILE_ONLY___ ═
322
+ 2024-07-22 13:20:43,699 INFO ___FILE_ONLY___ ═
323
+ 2024-07-22 13:20:43,699 INFO ___FILE_ONLY___ ═
324
+ 2024-07-22 13:20:43,699 INFO ___FILE_ONLY___ ═
325
+ 2024-07-22 13:20:43,700 INFO ___FILE_ONLY___ ═
326
+ 2024-07-22 13:20:43,700 INFO ___FILE_ONLY___ ═
327
+ 2024-07-22 13:20:43,700 INFO ___FILE_ONLY___ ═
328
+ 2024-07-22 13:20:43,701 INFO ___FILE_ONLY___ ═
329
+ 2024-07-22 13:20:43,701 INFO ___FILE_ONLY___ ═
330
+ 2024-07-22 13:20:43,701 INFO ___FILE_ONLY___ ═
331
+ 2024-07-22 13:20:43,702 INFO ___FILE_ONLY___ ═
332
+ 2024-07-22 13:20:43,702 INFO ___FILE_ONLY___ ═
333
+ 2024-07-22 13:20:43,702 INFO ___FILE_ONLY___ ═
334
+ 2024-07-22 13:20:43,703 INFO ___FILE_ONLY___ ═
335
+ 2024-07-22 13:20:43,703 INFO ___FILE_ONLY___ ═
336
+ 2024-07-22 13:20:43,703 INFO ___FILE_ONLY___ ═
337
+ 2024-07-22 13:20:43,704 INFO ___FILE_ONLY___ ═
338
+ 2024-07-22 13:20:43,704 INFO ___FILE_ONLY___ ═
339
+ 2024-07-22 13:20:43,704 INFO ___FILE_ONLY___ ═
340
+ 2024-07-22 13:20:43,705 INFO ___FILE_ONLY___ ═
341
+ 2024-07-22 13:20:43,705 INFO ___FILE_ONLY___ ═
342
+ 2024-07-22 13:20:43,705 INFO ___FILE_ONLY___ ═
343
+ 2024-07-22 13:20:43,706 INFO ___FILE_ONLY___ ═
344
+ 2024-07-22 13:20:43,706 INFO ___FILE_ONLY___ ═
345
+ 2024-07-22 13:20:43,706 INFO ___FILE_ONLY___ ═
346
+ 2024-07-22 13:20:43,707 INFO ___FILE_ONLY___ ═
347
+ 2024-07-22 13:20:43,707 INFO ___FILE_ONLY___ ═
348
+ 2024-07-22 13:20:43,707 INFO ___FILE_ONLY___ ═
349
+ 2024-07-22 13:20:43,708 INFO ___FILE_ONLY___ ═
350
+ 2024-07-22 13:20:43,708 INFO ___FILE_ONLY___ ═
351
+ 2024-07-22 13:20:43,708 INFO ___FILE_ONLY___ ═
352
+ 2024-07-22 13:20:43,709 INFO ___FILE_ONLY___ ═
353
+ 2024-07-22 13:20:43,709 INFO ___FILE_ONLY___ ═
354
+ 2024-07-22 13:20:43,709 INFO ___FILE_ONLY___ ═
355
+ 2024-07-22 13:20:43,710 INFO ___FILE_ONLY___ ═
356
+ 2024-07-22 13:20:43,710 INFO ___FILE_ONLY___ ═
357
+ 2024-07-22 13:20:43,711 INFO ___FILE_ONLY___ ═
358
+ 2024-07-22 13:20:43,711 INFO ___FILE_ONLY___ ═
359
+ 2024-07-22 13:20:43,711 INFO ___FILE_ONLY___ ═
360
+ 2024-07-22 13:20:43,712 INFO ___FILE_ONLY___ ═
361
+ 2024-07-22 13:20:43,712 INFO ___FILE_ONLY___ ═
362
+ 2024-07-22 13:20:43,712 INFO ___FILE_ONLY___ ═
363
+ 2024-07-22 13:20:43,713 INFO ___FILE_ONLY___ ═
364
+ 2024-07-22 13:20:43,713 INFO ___FILE_ONLY___ ═
365
+ 2024-07-22 13:20:43,713 INFO ___FILE_ONLY___ ═
366
+ 2024-07-22 13:20:43,714 INFO ___FILE_ONLY___ ═
367
+ 2024-07-22 13:20:43,714 INFO ___FILE_ONLY___ ═
368
+ 2024-07-22 13:20:43,714 INFO ___FILE_ONLY___ ╝
369
+
370
+ 2024-07-22 13:20:43,717 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
371
+
372
+ 2024-07-22 13:20:43,717 INFO ___FILE_ONLY___ ╠═ Downloading: Cloud Storage Command Line Tool (Platfor... ═╣
373
+
374
+ 2024-07-22 13:20:43,717 INFO ___FILE_ONLY___ ╚
375
+ 2024-07-22 13:20:43,721 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
376
+ 2024-07-22 13:20:43,853 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gsutil-nix-20240106004423.tar.gz HTTP/1.1" 200 2042
377
+ 2024-07-22 13:20:43,854 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
378
+ 2024-07-22 13:20:43,854 INFO ___FILE_ONLY___ ╝
379
+
380
+ 2024-07-22 13:20:43,856 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
381
+
382
+ 2024-07-22 13:20:43,856 INFO ___FILE_ONLY___ ╠═ Downloading: Default set of gcloud commands ═╣
383
+
384
+ 2024-07-22 13:20:43,856 INFO ___FILE_ONLY___ ╚
385
+ 2024-07-22 13:20:43,857 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
386
+ 2024-07-22 13:20:43,857 INFO ___FILE_ONLY___ ╝
387
+
388
+ 2024-07-22 13:20:43,859 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
389
+
390
+ 2024-07-22 13:20:43,859 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CLI Core Libraries (Platfor... ═╣
391
+
392
+ 2024-07-22 13:20:43,859 INFO ___FILE_ONLY___ ╚
393
+ 2024-07-22 13:20:43,863 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
394
+ 2024-07-22 13:20:43,936 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-core-nix-20240106004423.tar.gz HTTP/1.1" 200 2410
395
+ 2024-07-22 13:20:43,937 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
396
+ 2024-07-22 13:20:43,937 INFO ___FILE_ONLY___ ╝
397
+
398
+ 2024-07-22 13:20:43,939 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
399
+
400
+ 2024-07-22 13:20:43,940 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CRC32C Hash Tool ═╣
401
+
402
+ 2024-07-22 13:20:43,940 INFO ___FILE_ONLY___ ╚
403
+ 2024-07-22 13:20:43,940 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
404
+ 2024-07-22 13:20:43,940 INFO ___FILE_ONLY___ ╝
405
+
406
+ 2024-07-22 13:20:43,942 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
407
+
408
+ 2024-07-22 13:20:43,942 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CRC32C Hash Tool (Platform ... ═╣
409
+
410
+ 2024-07-22 13:20:43,942 INFO ___FILE_ONLY___ ╚
411
+ 2024-07-22 13:20:43,952 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
412
+ 2024-07-22 13:20:44,031 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gcloud-crc32c-linux-x86_64-20240712142834.tar.gz HTTP/1.1" 200 1350263
413
+ 2024-07-22 13:20:44,103 INFO ___FILE_ONLY___ ═
414
+ 2024-07-22 13:20:44,103 INFO ___FILE_ONLY___ ═
415
+ 2024-07-22 13:20:44,104 INFO ___FILE_ONLY___ ═
416
+ 2024-07-22 13:20:44,104 INFO ___FILE_ONLY___ ═
417
+ 2024-07-22 13:20:44,104 INFO ___FILE_ONLY___ ═
418
+ 2024-07-22 13:20:44,104 INFO ___FILE_ONLY___ ═
419
+ 2024-07-22 13:20:44,104 INFO ___FILE_ONLY___ ═══ [repeated single-tick progress output elided] ═══
+ 2024-07-22 13:20:44,110 INFO ___FILE_ONLY___ ╝
+ 
475
+ 2024-07-22 13:20:44,112 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
476
+
477
+ 2024-07-22 13:20:44,112 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud cli dependencies (Platform Specific) ═╣
478
+
479
+ 2024-07-22 13:20:44,112 INFO ___FILE_ONLY___ ╚
480
+ 2024-07-22 13:20:44,116 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
481
+ 2024-07-22 13:20:44,246 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gcloud-deps-linux-x86_64-20210416153011.tar.gz HTTP/1.1" 200 104
482
+ 2024-07-22 13:20:44,247 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
483
+ 2024-07-22 13:20:44,247 INFO ___FILE_ONLY___ ╝
484
+
485
+ 2024-07-22 13:20:44,249 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
486
+
487
+ 2024-07-22 13:20:44,249 INFO ___FILE_ONLY___ ╠═ Installing: BigQuery Command Line Tool ═╣
488
+
489
+ 2024-07-22 13:20:44,250 INFO ___FILE_ONLY___ ╚
490
+ 2024-07-22 13:20:44,391 INFO ___FILE_ONLY___ ═══ [repeated single-tick progress output elided] ═══
+ 2024-07-22 13:20:44,554 INFO ___FILE_ONLY___ ╝
+ 
552
+ 2024-07-22 13:20:44,570 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
553
+
554
+ 2024-07-22 13:20:44,570 INFO ___FILE_ONLY___ ╠═ Installing: BigQuery Command Line Tool (Platform Spec... ═╣
555
+
556
+ 2024-07-22 13:20:44,571 INFO ___FILE_ONLY___ ╚
557
+ 2024-07-22 13:20:44,572 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
558
+ 2024-07-22 13:20:44,572 INFO ___FILE_ONLY___ ╝
559
+
560
+ 2024-07-22 13:20:44,580 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
561
+
562
+ 2024-07-22 13:20:44,580 INFO ___FILE_ONLY___ ╠═ Installing: Bundled Python 3.11 ═╣
563
+
564
+ 2024-07-22 13:20:44,580 INFO ___FILE_ONLY___ ╚
565
+ 2024-07-22 13:20:44,586 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
566
+ 2024-07-22 13:20:44,586 INFO ___FILE_ONLY___ ╝
567
+
568
+ 2024-07-22 13:20:44,588 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
569
+
570
+ 2024-07-22 13:20:44,588 INFO ___FILE_ONLY___ ╠═ Installing: Bundled Python 3.11 (Platform Specific) ═╣
571
+
572
+ 2024-07-22 13:20:44,588 INFO ___FILE_ONLY___ ╚
573
+ 2024-07-22 13:20:46,996 INFO ___FILE_ONLY___ ═══ [repeated single-tick progress output elided] ═══
+ 2024-07-22 13:20:49,852 INFO ___FILE_ONLY___ ╝
+ 
635
+ 2024-07-22 13:20:49,967 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
636
+
637
+ 2024-07-22 13:20:49,968 INFO ___FILE_ONLY___ ╠═ Installing: Cloud Storage Command Line Tool ═╣
638
+
639
+ 2024-07-22 13:20:49,968 INFO ___FILE_ONLY___ ╚
640
+ 2024-07-22 13:20:50,761 INFO ___FILE_ONLY___ ═══ [repeated single-tick progress output elided] ═══
+ 2024-07-22 13:20:51,648 INFO ___FILE_ONLY___ ╝
+ 
702
+ 2024-07-22 13:20:51,728 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
703
+
704
+ 2024-07-22 13:20:51,728 INFO ___FILE_ONLY___ ╠═ Installing: Cloud Storage Command Line Tool (Platform... ═╣
705
+
706
+ 2024-07-22 13:20:51,728 INFO ___FILE_ONLY___ ╚
707
+ 2024-07-22 13:20:51,729 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
708
+ 2024-07-22 13:20:51,730 INFO ___FILE_ONLY___ ╝
709
+
710
+ 2024-07-22 13:20:51,737 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
711
+
712
+ 2024-07-22 13:20:51,737 INFO ___FILE_ONLY___ ╠═ Installing: Default set of gcloud commands ═╣
713
+
714
+ 2024-07-22 13:20:51,738 INFO ___FILE_ONLY___ ╚
715
+ 2024-07-22 13:20:51,743 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
716
+ 2024-07-22 13:20:51,743 INFO ___FILE_ONLY___ ╝
717
+
718
+ 2024-07-22 13:20:51,745 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
719
+
720
+ 2024-07-22 13:20:51,745 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CLI Core Libraries (Platform... ═╣
721
+
722
+ 2024-07-22 13:20:51,745 INFO ___FILE_ONLY___ ╚
723
+ 2024-07-22 13:20:51,746 INFO ___FILE_ONLY___ ══════════════════════════════
724
+ 2024-07-22 13:20:51,747 INFO ___FILE_ONLY___ ══════════════════════════════
725
+ 2024-07-22 13:20:51,747 INFO ___FILE_ONLY___ ╝
726
+
727
+ 2024-07-22 13:20:51,755 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
728
+
729
+ 2024-07-22 13:20:51,755 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CRC32C Hash Tool ═╣
730
+
731
+ 2024-07-22 13:20:51,755 INFO ___FILE_ONLY___ ╚
732
+ 2024-07-22 13:20:51,760 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
733
+ 2024-07-22 13:20:51,760 INFO ___FILE_ONLY___ ╝
734
+
735
+ 2024-07-22 13:20:51,762 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
736
+
737
+ 2024-07-22 13:20:51,762 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CRC32C Hash Tool (Platform S... ═╣
738
+
739
+ 2024-07-22 13:20:51,762 INFO ___FILE_ONLY___ ╚
740
+ 2024-07-22 13:20:51,799 INFO ___FILE_ONLY___ ══════════════════════════════
741
+ 2024-07-22 13:20:51,800 INFO ___FILE_ONLY___ ══════════════════════════════
742
+ 2024-07-22 13:20:51,800 INFO ___FILE_ONLY___ ╝
743
+
744
+ 2024-07-22 13:20:51,808 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
745
+
746
+ 2024-07-22 13:20:51,808 INFO ___FILE_ONLY___ ╠═ Installing: gcloud cli dependencies (Platform Specific) ═╣
747
+
748
+ 2024-07-22 13:20:51,809 INFO ___FILE_ONLY___ ╚
749
+ 2024-07-22 13:20:51,809 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
750
+ 2024-07-22 13:20:51,809 INFO ___FILE_ONLY___ ╝
751
+
752
+ 2024-07-22 13:20:51,819 DEBUG root Updating notification cache...
753
+ 2024-07-22 13:20:51,819 INFO ___FILE_ONLY___
754
+
755
+ 2024-07-22 13:20:51,821 INFO ___FILE_ONLY___ Performing post processing steps...
756
+ 2024-07-22 13:20:51,821 DEBUG root Executing command: ['/tools/google-cloud-sdk/bin/gcloud', 'components', 'post-process']
757
+ 2024-07-22 13:21:03,233 DEBUG ___FILE_ONLY___
758
+ 2024-07-22 13:21:03,233 DEBUG ___FILE_ONLY___
759
+ 2024-07-22 13:21:03,338 INFO ___FILE_ONLY___
760
+ Update done!
761
+
762
+
763
+ 2024-07-22 13:21:03,342 DEBUG root Chosen display Format:none
764
+ 2024-07-22 13:21:03,343 INFO root Display format: "none"
.config/logs/2024.07.22/13.20.52.312699.log ADDED
@@ -0,0 +1,5 @@
1
+ 2024-07-22 13:20:52,313 DEBUG root Loaded Command Group: ['gcloud', 'components']
2
+ 2024-07-22 13:20:52,316 DEBUG root Loaded Command Group: ['gcloud', 'components', 'post_process']
3
+ 2024-07-22 13:20:52,318 DEBUG root Running [gcloud.components.post-process] with arguments: []
4
+ 2024-07-22 13:21:03,143 DEBUG root Chosen display Format:none
5
+ 2024-07-22 13:21:03,143 INFO root Display format: "none"
.config/logs/2024.07.22/13.21.03.941407.log ADDED
@@ -0,0 +1,123 @@
1
+ 2024-07-22 13:21:03,942 DEBUG root Loaded Command Group: ['gcloud', 'components']
2
+ 2024-07-22 13:21:03,944 DEBUG root Loaded Command Group: ['gcloud', 'components', 'update']
3
+ 2024-07-22 13:21:03,947 DEBUG root Running [gcloud.components.update] with arguments: [--quiet: "True", COMPONENT-IDS:8: "['gcloud', 'core', 'bq', 'gsutil', 'compute', 'preview', 'alpha', 'beta']"]
4
+ 2024-07-22 13:21:03,949 INFO ___FILE_ONLY___ Beginning update. This process may take several minutes.
5
+
6
+ 2024-07-22 13:21:03,958 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
7
+ 2024-07-22 13:21:04,033 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components-2.json HTTP/1.1" 200 223375
8
+ 2024-07-22 13:21:04,055 WARNING root Component [compute] no longer exists.
9
+ 2024-07-22 13:21:04,056 WARNING root Component [preview] no longer exists.
10
+ 2024-07-22 13:21:04,057 INFO ___FILE_ONLY___
11
+
12
+ 2024-07-22 13:21:04,057 INFO ___FILE_ONLY___
13
+ Your current Google Cloud CLI version is: 484.0.0
14
+
15
+ 2024-07-22 13:21:04,057 INFO ___FILE_ONLY___ Installing components from version: 484.0.0
16
+
17
+ 2024-07-22 13:21:04,058 INFO ___FILE_ONLY___
18
+
19
+ 2024-07-22 13:21:04,058 DEBUG root Chosen display Format:table[box,title="These components will be removed."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
20
+ 2024-07-22 13:21:04,059 DEBUG root Chosen display Format:table[box,title="These components will be updated."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
21
+ 2024-07-22 13:21:04,059 DEBUG root Chosen display Format:table[box,title="These components will be installed."](details.display_name:label=Name:align=left,version.version_string:label=Version:align=right,data.size.size(zero="",min=1048576):label=Size:align=right)
22
+ 2024-07-22 13:21:04,121 INFO ___FILE_ONLY___ ┌──────────────────────────────────────────────┐
+ 2024-07-22 13:21:04,122 INFO ___FILE_ONLY___ │      These components will be installed.     │
+ 2024-07-22 13:21:04,122 INFO ___FILE_ONLY___ ├───────────────────────┬────────────┬─────────┤
+ 2024-07-22 13:21:04,122 INFO ___FILE_ONLY___ │          Name         │    Version │    Size │
+ 2024-07-22 13:21:04,122 INFO ___FILE_ONLY___ ├───────────────────────┼────────────┼─────────┤
+ 2024-07-22 13:21:04,123 INFO ___FILE_ONLY___ │ gcloud Alpha Commands │ 2024.07.12 │ < 1 MiB │
+ 2024-07-22 13:21:04,124 INFO ___FILE_ONLY___ │ gcloud Beta Commands  │ 2024.07.12 │ < 1 MiB │
+ 2024-07-22 13:21:04,124 INFO ___FILE_ONLY___ └───────────────────────┴────────────┴─────────┘
+ 
66
+ 2024-07-22 13:21:04,129 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
67
+ 2024-07-22 13:21:04,209 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/RELEASE_NOTES HTTP/1.1" 200 1243506
68
+ 2024-07-22 13:21:04,337 INFO ___FILE_ONLY___ For the latest full release notes, please visit:
69
+ https://cloud.google.com/sdk/release_notes
70
+
71
+
72
+ 2024-07-22 13:21:04,338 INFO ___FILE_ONLY___ Performing in place update...
73
+
74
+
75
+ 2024-07-22 13:21:04,340 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
76
+
77
+ 2024-07-22 13:21:04,340 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud Alpha Commands ═╣
78
+
79
+ 2024-07-22 13:21:04,340 INFO ___FILE_ONLY___ ╚
80
+ 2024-07-22 13:21:04,344 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
81
+ 2024-07-22 13:21:04,419 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-alpha-20240712142834.tar.gz HTTP/1.1" 200 800
82
+ 2024-07-22 13:21:04,420 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
83
+ 2024-07-22 13:21:04,420 INFO ___FILE_ONLY___ ╝
84
+
85
+ 2024-07-22 13:21:04,422 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
86
+
87
+ 2024-07-22 13:21:04,422 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud Beta Commands ═╣
88
+
89
+ 2024-07-22 13:21:04,422 INFO ___FILE_ONLY___ ╚
90
+ 2024-07-22 13:21:04,428 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
91
+ 2024-07-22 13:21:04,499 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-beta-20240712142834.tar.gz HTTP/1.1" 200 797
92
+ 2024-07-22 13:21:04,500 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
93
+ 2024-07-22 13:21:04,500 INFO ___FILE_ONLY___ ╝
94
+
95
+ 2024-07-22 13:21:04,502 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
96
+
97
+ 2024-07-22 13:21:04,502 INFO ___FILE_ONLY___ ╠═ Installing: gcloud Alpha Commands ═╣
98
+
99
+ 2024-07-22 13:21:04,502 INFO ___FILE_ONLY___ ╚
100
+ 2024-07-22 13:21:04,504 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
101
+ 2024-07-22 13:21:04,504 INFO ___FILE_ONLY___ ╝
102
+
103
+ 2024-07-22 13:21:04,511 INFO ___FILE_ONLY___ ╔════════════════════════════════════════════════════════════╗
104
+
105
+ 2024-07-22 13:21:04,511 INFO ___FILE_ONLY___ ╠═ Installing: gcloud Beta Commands ═╣
106
+
107
+ 2024-07-22 13:21:04,512 INFO ___FILE_ONLY___ ╚
108
+ 2024-07-22 13:21:04,513 INFO ___FILE_ONLY___ ════════════════════════════════════════════════════════════
109
+ 2024-07-22 13:21:04,513 INFO ___FILE_ONLY___ ╝
110
+
111
+ 2024-07-22 13:21:04,522 DEBUG root Updating notification cache...
112
+ 2024-07-22 13:21:04,523 INFO ___FILE_ONLY___
113
+
114
+ 2024-07-22 13:21:04,525 INFO ___FILE_ONLY___ Performing post processing steps...
115
+ 2024-07-22 13:21:04,526 DEBUG root Executing command: ['/tools/google-cloud-sdk/bin/gcloud', 'components', 'post-process']
116
+ 2024-07-22 13:21:15,923 DEBUG ___FILE_ONLY___
117
+ 2024-07-22 13:21:15,923 DEBUG ___FILE_ONLY___
118
+ 2024-07-22 13:21:16,042 INFO ___FILE_ONLY___
119
+ Update done!
120
+
121
+
122
+ 2024-07-22 13:21:16,046 DEBUG root Chosen display Format:none
123
+ 2024-07-22 13:21:16,046 INFO root Display format: "none"
.config/logs/2024.07.22/13.21.05.029456.log ADDED
@@ -0,0 +1,5 @@
1
+ 2024-07-22 13:21:05,030 DEBUG root Loaded Command Group: ['gcloud', 'components']
2
+ 2024-07-22 13:21:05,032 DEBUG root Loaded Command Group: ['gcloud', 'components', 'post_process']
3
+ 2024-07-22 13:21:05,035 DEBUG root Running [gcloud.components.post-process] with arguments: []
4
+ 2024-07-22 13:21:15,828 DEBUG root Chosen display Format:none
5
+ 2024-07-22 13:21:15,828 INFO root Display format: "none"
.config/logs/2024.07.22/13.21.16.643722.log ADDED
@@ -0,0 +1,8 @@
1
+ 2024-07-22 13:21:16,646 DEBUG root Loaded Command Group: ['gcloud', 'config']
2
+ 2024-07-22 13:21:16,702 DEBUG root Loaded Command Group: ['gcloud', 'config', 'set']
3
+ 2024-07-22 13:21:16,706 DEBUG root Running [gcloud.config.set] with arguments: [SECTION/PROPERTY: "component_manager/disable_update_check", VALUE: "true"]
4
+ 2024-07-22 13:21:16,707 INFO ___FILE_ONLY___ Updated property [component_manager/disable_update_check].
5
+
6
+ 2024-07-22 13:21:16,708 DEBUG root Chosen display Format:default
7
+ 2024-07-22 13:21:16,709 INFO root Display format: "default"
8
+ 2024-07-22 13:21:16,709 DEBUG root SDK update checks are disabled.
.config/logs/2024.07.22/13.21.17.317802.log ADDED
@@ -0,0 +1,8 @@
1
+ 2024-07-22 13:21:17,320 DEBUG root Loaded Command Group: ['gcloud', 'config']
2
+ 2024-07-22 13:21:17,377 DEBUG root Loaded Command Group: ['gcloud', 'config', 'set']
3
+ 2024-07-22 13:21:17,380 DEBUG root Running [gcloud.config.set] with arguments: [SECTION/PROPERTY: "compute/gce_metadata_read_timeout_sec", VALUE: "0"]
4
+ 2024-07-22 13:21:17,381 INFO ___FILE_ONLY___ Updated property [compute/gce_metadata_read_timeout_sec].
5
+
6
+ 2024-07-22 13:21:17,382 DEBUG root Chosen display Format:default
7
+ 2024-07-22 13:21:17,383 INFO root Display format: "default"
8
+ 2024-07-22 13:21:17,383 DEBUG root SDK update checks are disabled.
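The two `gcloud config set` invocations recorded in these logs update CLI properties that, in gcloud's INI-style configuration file, would correspond roughly to the following fragment (a sketch reconstructed from the logged arguments, not the literal file contents):

```
[component_manager]
disable_update_check = true

[compute]
gce_metadata_read_timeout_sec = 0
```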
.gitattributes CHANGED
@@ -33,3 +33,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ diffusers/examples/research_projects/gligen/generated-images-100000-00.png filter=lfs diff=lfs merge=lfs -text
37
+ sample_data/mnist_test.csv filter=lfs diff=lfs merge=lfs -text
38
+ sample_data/mnist_train_small.csv filter=lfs diff=lfs merge=lfs -text
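The three new entries follow the standard Git LFS attribute pattern; any further large file added to this repo would need a matching line, for example (hypothetical path):

```
# Hypothetical entry: route another large file through Git LFS
sample_data/another_large_file.csv filter=lfs diff=lfs merge=lfs -text
```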
README.md ADDED
@@ -0,0 +1,43 @@
1
+ ---
2
+ base_model: runwayml/stable-diffusion-v1-5
3
+ library_name: diffusers
4
+ license: creativeml-openrail-m
5
+ tags:
6
+ - text-to-image
7
+ - dreambooth
8
+ - diffusers-training
9
+ - stable-diffusion
10
+ - stable-diffusion-diffusers
11
+ inference: true
12
+ instance_prompt: a photo of kirby
13
+ ---
14
+
15
+ <!-- This model card has been generated automatically according to the information the training script had access to. You
16
+ should probably proofread and complete it, then remove this comment. -->
17
+
18
+
19
+ # DreamBooth - ShadeEngine/content
20
+
21
+ This is a DreamBooth model derived from runwayml/stable-diffusion-v1-5. The weights were trained with the instance prompt "a photo of kirby" using [DreamBooth](https://dreambooth.github.io/).
22
+ Some example images are shown below.
23
+
24
+
25
+
26
+ DreamBooth for the text encoder was enabled: False.
27
+
28
+
29
+ ## Intended uses & limitations
30
+
31
+ #### How to use
32
+
33
+ ```python
+ # Minimal sketch assuming the standard diffusers text-to-image API;
+ # adjust device/dtype for your hardware.
+ from diffusers import DiffusionPipeline
+ 
+ pipeline = DiffusionPipeline.from_pretrained("ShadeEngine/content").to("cuda")
+ image = pipeline("a photo of kirby").images[0]  # the instance prompt used in training
+ image.save("kirby.png")
+ ```
36
+
37
+ #### Limitations and bias
38
+
39
+ [TODO: provide examples of latent issues and potential remediations]
40
+
41
+ ## Training details
42
+
43
+ [TODO: describe the data used to train the model]
diffusers/.github/ISSUE_TEMPLATE/bug-report.yml ADDED
@@ -0,0 +1,110 @@
1
+ name: "\U0001F41B Bug Report"
2
+ description: Report a bug on Diffusers
3
+ labels: [ "bug" ]
4
+ body:
5
+ - type: markdown
6
+ attributes:
7
+ value: |
8
+ Thanks a lot for taking the time to file this issue 🤗.
9
+ Issues not only help to improve the library, but also publicly document common problems, questions, and workflows for the whole community!
10
+ Thus, issues are of the same importance as pull requests when contributing to this library ❤️.
11
+ In order to make your issue as **useful for the community as possible**, let's try to stick to some simple guidelines:
12
+ - 1. Please try to be as precise and concise as possible.
13
+ *Give your issue a fitting title. Assume that someone with very limited knowledge of Diffusers can understand your issue. Add links to the source code, documentation, other issues, pull requests, etc...*
14
+ - 2. If your issue is about something not working, **always** provide a reproducible code snippet. The reader should be able to reproduce your issue by **only copy-pasting your code snippet into a Python shell**.
15
+ *The community cannot solve your issue if it cannot reproduce it. If your bug is related to training, add your training script and make everything needed to train public. Otherwise, just add a simple Python code snippet.*
16
+ - 3. Add the **minimum** amount of code / context that is needed to understand and reproduce your issue.
17
+ *Make the life of maintainers easy. `diffusers` gets many issues every day. Make sure your issue is about one bug and one bug only. Add only the context and code needed to understand your issue - nothing more. Generally, every issue is a way of documenting this library; try to make it a good documentation entry.*
18
+ - 4. For issues related to community pipelines (i.e., the pipelines located in the `examples/community` folder), please tag the author of the pipeline in your issue thread as those pipelines are not maintained.
19
+ - type: markdown
20
+ attributes:
21
+ value: |
22
+ For more in-detail information on how to write good issues you can have a look [here](https://huggingface.co/course/chapter8/5?fw=pt).
23
+ - type: textarea
24
+ id: bug-description
25
+ attributes:
26
+ label: Describe the bug
27
+ description: A clear and concise description of what the bug is. If you intend to submit a pull request for this issue, tell us in the description. Thanks!
28
+ placeholder: Bug description
29
+ validations:
30
+ required: true
31
+ - type: textarea
32
+ id: reproduction
33
+ attributes:
34
+ label: Reproduction
35
+ description: Please provide a minimal reproducible code which we can copy/paste and reproduce the issue.
36
+ placeholder: Reproduction
37
+ validations:
38
+ required: true
39
+ - type: textarea
40
+ id: logs
41
+ attributes:
42
+ label: Logs
43
+ description: "Please include the Python logs if you can."
44
+ render: shell
45
+ - type: textarea
46
+ id: system-info
47
+ attributes:
48
+ label: System Info
49
+ description: Please share your system info with us. You can run the command `diffusers-cli env` and copy-paste its output below.
50
+ placeholder: Diffusers version, platform, Python version, ...
51
+ validations:
52
+ required: true
53
+ - type: textarea
54
+ id: who-can-help
55
+ attributes:
56
+ label: Who can help?
57
+ description: |
58
+ Your issue will be replied to more quickly if you can figure out the right person to tag with @.
59
+ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
60
+
61
+ All issues are read by one of the core maintainers, so if you don't know who to tag, just leave this blank and
62
+ a core maintainer will ping the right person.
63
+
64
+ Please tag a maximum of 2 people.
65
+
66
+ Questions on DiffusionPipeline (Saving, Loading, From pretrained, ...): @sayakpaul @DN6
67
+
68
+ Questions on pipelines:
69
+ - Stable Diffusion @yiyixuxu @asomoza
70
+ - Stable Diffusion XL @yiyixuxu @sayakpaul @DN6
71
+ - Stable Diffusion 3: @yiyixuxu @sayakpaul @DN6 @asomoza
72
+ - Kandinsky @yiyixuxu
73
+ - ControlNet @sayakpaul @yiyixuxu @DN6
74
+ - T2I Adapter @sayakpaul @yiyixuxu @DN6
75
+ - IF @DN6
76
+ - Text-to-Video / Video-to-Video @DN6 @a-r-r-o-w
77
+ - Wuerstchen @DN6
78
+ - Other: @yiyixuxu @DN6
79
+ - Improving generation quality: @asomoza
80
+
81
+ Questions on models:
82
+ - UNet @DN6 @yiyixuxu @sayakpaul
83
+ - VAE @sayakpaul @DN6 @yiyixuxu
84
+ - Transformers/Attention @DN6 @yiyixuxu @sayakpaul
85
+
86
+ Questions on single file checkpoints: @DN6
87
+
88
+ Questions on Schedulers: @yiyixuxu
89
+
90
+ Questions on LoRA: @sayakpaul
91
+
92
+ Questions on Textual Inversion: @sayakpaul
93
+
94
+ Questions on Training:
95
+ - DreamBooth @sayakpaul
96
+ - Text-to-Image Fine-tuning @sayakpaul
97
+ - Textual Inversion @sayakpaul
98
+ - ControlNet @sayakpaul
99
+
100
+ Questions on Tests: @DN6 @sayakpaul @yiyixuxu
101
+
102
+ Questions on Documentation: @stevhliu
103
+
104
+ Questions on JAX- and MPS-related things: @pcuenca
105
+
106
+ Questions on audio pipelines: @sanchit-gandhi
107
+
108
+
109
+
110
+ placeholder: "@Username ..."
diffusers/.github/ISSUE_TEMPLATE/config.yml ADDED
@@ -0,0 +1,4 @@
+ contact_links:
+ - name: Questions / Discussions
+ url: https://github.com/huggingface/diffusers/discussions
+ about: General usage questions and community discussions
diffusers/.github/ISSUE_TEMPLATE/feature_request.md ADDED
@@ -0,0 +1,20 @@
+ ---
+ name: "\U0001F680 Feature Request"
+ about: Suggest an idea for this project
+ title: ''
+ labels: ''
+ assignees: ''
+
+ ---
+
+ **Is your feature request related to a problem? Please describe.**
+ A clear and concise description of what the problem is. Ex. I'm always frustrated when [...].
+
+ **Describe the solution you'd like.**
+ A clear and concise description of what you want to happen.
+
+ **Describe alternatives you've considered.**
+ A clear and concise description of any alternative solutions or features you've considered.
+
+ **Additional context.**
+ Add any other context or screenshots about the feature request here.
diffusers/.github/ISSUE_TEMPLATE/feedback.md ADDED
@@ -0,0 +1,12 @@
+ ---
+ name: "💬 Feedback about API Design"
+ about: Give feedback about the current API design
+ title: ''
+ labels: ''
+ assignees: ''
+
+ ---
+
+ **What API design would you like to have changed or added to the library? Why?**
+
+ **What use case would this enable or better enable? Can you give us a code example?**
diffusers/.github/ISSUE_TEMPLATE/new-model-addition.yml ADDED
@@ -0,0 +1,31 @@
+ name: "\U0001F31F New Model/Pipeline/Scheduler Addition"
+ description: Submit a proposal/request to implement a new diffusion model/pipeline/scheduler
+ labels: [ "New model/pipeline/scheduler" ]
+
+ body:
+ - type: textarea
+ id: description-request
+ validations:
+ required: true
+ attributes:
+ label: Model/Pipeline/Scheduler description
+ description: |
+ Put any and all important information relative to the model/pipeline/scheduler
+
+ - type: checkboxes
+ id: information-tasks
+ attributes:
+ label: Open source status
+ description: |
+ Please note that if the model implementation isn't available or if the weights aren't open-source, we are less likely to implement it in `diffusers`.
+ options:
+ - label: "The model implementation is available."
+ - label: "The model weights are available (Only relevant if addition is not a scheduler)."
+
+ - type: textarea
+ id: additional-info
+ attributes:
+ label: Provide useful links for the implementation
+ description: |
+ Please provide information regarding the implementation, the weights, and the authors.
+ Please mention the authors by @gh-username if you're aware of their usernames.
diffusers/.github/ISSUE_TEMPLATE/translate.md ADDED
@@ -0,0 +1,29 @@
+ ---
+ name: 🌐 Translating a New Language?
+ about: Start a new translation effort in your language
+ title: '[<languageCode>] Translating docs to <languageName>'
+ labels: WIP
+ assignees: ''
+
+ ---
+
+ <!--
+ Note: Please search to see if an issue already exists for the language you are trying to translate.
+ -->
+
+ Hi!
+
+ Let's bring the documentation to all the <languageName>-speaking community 🌐.
+
+ Who would want to translate? Please follow the 🤗 [TRANSLATING guide](https://github.com/huggingface/diffusers/blob/main/docs/TRANSLATING.md). Here is a list of the files ready for translation. Let us know in this issue if you'd like to translate any, and we'll add your name to the list.
+
+ Some notes:
+
+ * Please translate using an informal tone (imagine you are talking with a friend about Diffusers 🤗).
+ * Please translate in a gender-neutral way.
+ * Add your translations to the folder called `<languageCode>` inside the [source folder](https://github.com/huggingface/diffusers/tree/main/docs/source).
+ * Register your translation in `<languageCode>/_toctree.yml`; please follow the order of the [English version](https://github.com/huggingface/diffusers/blob/main/docs/source/en/_toctree.yml).
+ * Once you're finished, open a pull request and tag this issue by including #issue-number in the description, where issue-number is the number of this issue. Please ping @stevhliu for review.
+ * 🙋 If you'd like others to help you with the translation, you can also post in the 🤗 [forums](https://discuss.huggingface.co/c/discussion-related-to-httpsgithubcomhuggingfacediffusers/63).
+
+ Thank you so much for your help! 🤗
diffusers/.github/PULL_REQUEST_TEMPLATE.md ADDED
@@ -0,0 +1,61 @@
+ # What does this PR do?
+
+ <!--
+ Congratulations! You've made it this far! You're not quite done yet though.
+
+ Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
+
+ Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
+
+ Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
+ -->
+
+ <!-- Remove if not applicable -->
+
+ Fixes # (issue)
+
+
+ ## Before submitting
+ - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
+ - [ ] Did you read the [contributor guideline](https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md)?
+ - [ ] Did you read our [philosophy doc](https://github.com/huggingface/diffusers/blob/main/PHILOSOPHY.md) (important for complex PRs)?
+ - [ ] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/c/discussion-related-to-httpsgithubcomhuggingfacediffusers/63)? Please add a link to it if that's the case.
+ - [ ] Did you make sure to update the documentation with your changes? Here are the
+ [documentation guidelines](https://github.com/huggingface/diffusers/tree/main/docs), and
+ [here are tips on formatting docstrings](https://github.com/huggingface/diffusers/tree/main/docs#writing-source-documentation).
+ - [ ] Did you write any new necessary tests?
+
+
+ ## Who can review?
+
+ Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
+ members/contributors who may be interested in your PR.
+
+ <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @.
+
+ If you know how to use git blame, that is the easiest way; otherwise, here is a rough guide of **who to tag**.
+ Please tag fewer than 3 people.
+
+ Core library:
+
+ - Schedulers: @yiyixuxu
+ - Pipelines and pipeline callbacks: @yiyixuxu and @asomoza
+ - Training examples: @sayakpaul
+ - Docs: @stevhliu and @sayakpaul
+ - JAX and MPS: @pcuenca
+ - Audio: @sanchit-gandhi
+ - General functionalities: @sayakpaul @yiyixuxu @DN6
+
+ Integrations:
+
+ - deepspeed: HF Trainer/Accelerate: @SunMarc
+ - PEFT: @sayakpaul @BenjaminBossan
+
+ HF projects:
+
+ - accelerate: [different repo](https://github.com/huggingface/accelerate)
+ - datasets: [different repo](https://github.com/huggingface/datasets)
+ - transformers: [different repo](https://github.com/huggingface/transformers)
+ - safetensors: [different repo](https://github.com/huggingface/safetensors)
+
+ -->
diffusers/.github/actions/setup-miniconda/action.yml ADDED
@@ -0,0 +1,146 @@
+ name: Set up conda environment for testing
+
+ description: Sets up miniconda in your ${RUNNER_TEMP} environment and gives you the ${CONDA_RUN} environment variable so you don't have to worry about polluting non-ephemeral runners anymore
+
+ inputs:
+ python-version:
+ description: Python version to set up in the conda environment
+ required: false
+ type: string
+ default: "3.9"
+ miniconda-version:
+ description: Miniconda version to install
+ required: false
+ type: string
+ default: "4.12.0"
+ environment-file:
+ description: Environment file to install dependencies from
+ required: false
+ type: string
+ default: ""
+
+ runs:
+ using: composite
+ steps:
+ # Use the same trick from https://github.com/marketplace/actions/setup-miniconda
+ # to refresh the cache daily. This is kind of optional though
+ - name: Get date
+ id: get-date
+ shell: bash
+ run: echo "today=$(/bin/date -u '+%Y%m%d')d" >> $GITHUB_OUTPUT
+ - name: Setup miniconda cache
+ id: miniconda-cache
+ uses: actions/cache@v2
+ with:
+ path: ${{ runner.temp }}/miniconda
+ key: miniconda-${{ runner.os }}-${{ runner.arch }}-${{ inputs.python-version }}-${{ steps.get-date.outputs.today }}
+ - name: Install miniconda (${{ inputs.miniconda-version }})
+ if: steps.miniconda-cache.outputs.cache-hit != 'true'
+ env:
+ MINICONDA_VERSION: ${{ inputs.miniconda-version }}
+ shell: bash -l {0}
+ run: |
+ MINICONDA_INSTALL_PATH="${RUNNER_TEMP}/miniconda"
+ mkdir -p "${MINICONDA_INSTALL_PATH}"
+ case ${RUNNER_OS}-${RUNNER_ARCH} in
+ Linux-X64)
+ MINICONDA_ARCH="Linux-x86_64"
+ ;;
+ macOS-ARM64)
+ MINICONDA_ARCH="MacOSX-arm64"
+ ;;
+ macOS-X64)
+ MINICONDA_ARCH="MacOSX-x86_64"
+ ;;
+ *)
+ echo "::error::Platform ${RUNNER_OS}-${RUNNER_ARCH} currently unsupported using this action"
+ exit 1
+ ;;
+ esac
+ MINICONDA_URL="https://repo.anaconda.com/miniconda/Miniconda3-py39_${MINICONDA_VERSION}-${MINICONDA_ARCH}.sh"
+ curl -fsSL "${MINICONDA_URL}" -o "${MINICONDA_INSTALL_PATH}/miniconda.sh"
+ bash "${MINICONDA_INSTALL_PATH}/miniconda.sh" -b -u -p "${MINICONDA_INSTALL_PATH}"
+ rm -rf "${MINICONDA_INSTALL_PATH}/miniconda.sh"
+ - name: Update GitHub path to include miniconda install
+ shell: bash
+ run: |
+ MINICONDA_INSTALL_PATH="${RUNNER_TEMP}/miniconda"
+ echo "${MINICONDA_INSTALL_PATH}/bin" >> $GITHUB_PATH
+ - name: Setup miniconda env cache (with env file)
+ id: miniconda-env-cache-env-file
+ if: ${{ runner.os }} == 'macOS' && ${{ inputs.environment-file }} != ''
+ uses: actions/cache@v2
+ with:
+ path: ${{ runner.temp }}/conda-python-${{ inputs.python-version }}
+ key: miniconda-env-${{ runner.os }}-${{ runner.arch }}-${{ inputs.python-version }}-${{ steps.get-date.outputs.today }}-${{ hashFiles(inputs.environment-file) }}
+ - name: Setup miniconda env cache (without env file)
+ id: miniconda-env-cache
+ if: ${{ runner.os }} == 'macOS' && ${{ inputs.environment-file }} == ''
+ uses: actions/cache@v2
+ with:
+ path: ${{ runner.temp }}/conda-python-${{ inputs.python-version }}
+ key: miniconda-env-${{ runner.os }}-${{ runner.arch }}-${{ inputs.python-version }}-${{ steps.get-date.outputs.today }}
+ - name: Setup conda environment with python (v${{ inputs.python-version }})
+ if: steps.miniconda-env-cache-env-file.outputs.cache-hit != 'true' && steps.miniconda-env-cache.outputs.cache-hit != 'true'
+ shell: bash
+ env:
+ PYTHON_VERSION: ${{ inputs.python-version }}
+ ENV_FILE: ${{ inputs.environment-file }}
+ run: |
+ CONDA_BASE_ENV="${RUNNER_TEMP}/conda-python-${PYTHON_VERSION}"
+ ENV_FILE_FLAG=""
+ if [[ -f "${ENV_FILE}" ]]; then
+ ENV_FILE_FLAG="--file ${ENV_FILE}"
+ elif [[ -n "${ENV_FILE}" ]]; then
+ echo "::warning::Specified env file (${ENV_FILE}) not found, not going to include it"
+ fi
+ conda create \
+ --yes \
+ --prefix "${CONDA_BASE_ENV}" \
+ "python=${PYTHON_VERSION}" \
+ ${ENV_FILE_FLAG} \
+ cmake=3.22 \
+ conda-build=3.21 \
+ ninja=1.10 \
+ pkg-config=0.29 \
+ wheel=0.37
+ - name: Clone the base conda environment and update GitHub env
+ shell: bash
+ env:
+ PYTHON_VERSION: ${{ inputs.python-version }}
+ CONDA_BASE_ENV: ${{ runner.temp }}/conda-python-${{ inputs.python-version }}
+ run: |
+ CONDA_ENV="${RUNNER_TEMP}/conda_environment_${GITHUB_RUN_ID}"
+ conda create \
+ --yes \
+ --prefix "${CONDA_ENV}" \
+ --clone "${CONDA_BASE_ENV}"
+ # TODO: conda-build could not be cloned because it hardcodes the path, so it
+ # could not be cached
+ conda install --yes -p ${CONDA_ENV} conda-build=3.21
+ echo "CONDA_ENV=${CONDA_ENV}" >> "${GITHUB_ENV}"
+ echo "CONDA_RUN=conda run -p ${CONDA_ENV} --no-capture-output" >> "${GITHUB_ENV}"
+ echo "CONDA_BUILD=conda run -p ${CONDA_ENV} conda-build" >> "${GITHUB_ENV}"
+ echo "CONDA_INSTALL=conda install -p ${CONDA_ENV}" >> "${GITHUB_ENV}"
+ - name: Get disk space usage and throw an error for low disk space
+ shell: bash
+ run: |
+ echo "Print the available disk space for manual inspection"
+ df -h
+ # Set the minimum requirement space to 4GB
+ MINIMUM_AVAILABLE_SPACE_IN_GB=4
+ MINIMUM_AVAILABLE_SPACE_IN_KB=$(($MINIMUM_AVAILABLE_SPACE_IN_GB * 1024 * 1024))
+ # Use KB to avoid floating point warning like 3.1GB
+ df -k | tr -s ' ' | cut -d' ' -f 4,9 | while read -r LINE;
+ do
+ AVAIL=$(echo $LINE | cut -f1 -d' ')
+ MOUNT=$(echo $LINE | cut -f2 -d' ')
+ if [ "$MOUNT" = "/" ]; then
+ if [ "$AVAIL" -lt "$MINIMUM_AVAILABLE_SPACE_IN_KB" ]; then
+ echo "There is only ${AVAIL}KB free space left in $MOUNT, which is less than the minimum requirement of ${MINIMUM_AVAILABLE_SPACE_IN_KB}KB. Please help create an issue to PyTorch Release Engineering via https://github.com/pytorch/test-infra/issues and provide the link to the workflow run."
+ exit 1;
+ else
+ echo "There is ${AVAIL}KB free space left in $MOUNT, continue"
+ fi
+ fi
+ done
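The installer-selection step in the action above boils down to mapping GitHub's `RUNNER_OS`/`RUNNER_ARCH` pair onto Anaconda's installer naming. A minimal sketch of that mapping as a standalone function (the function name `miniconda_arch` is ours, not part of the action):

```shell
# Sketch of the case statement the composite action uses to pick a
# Miniconda installer suffix. Inputs follow GitHub's runner naming
# (RUNNER_OS in {Linux, macOS, Windows}; RUNNER_ARCH in {X64, ARM64, ...}).
miniconda_arch() {
  case "$1-$2" in
    Linux-X64)   echo "Linux-x86_64" ;;
    macOS-ARM64) echo "MacOSX-arm64" ;;
    macOS-X64)   echo "MacOSX-x86_64" ;;
    *)           echo "unsupported platform: $1-$2" >&2; return 1 ;;
  esac
}

miniconda_arch Linux X64   # -> Linux-x86_64
```

The resulting suffix is interpolated into the download URL, e.g. `Miniconda3-py39_4.12.0-Linux-x86_64.sh`; any other platform combination fails fast, mirroring the `exit 1` branch in the action.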
diffusers/.github/workflows/benchmark.yml ADDED
@@ -0,0 +1,66 @@
+ name: Benchmarking tests
+
+ on:
+ workflow_dispatch:
+ schedule:
+ - cron: "30 1 1,15 * *" # every 2 weeks on the 1st and the 15th of every month at 1:30 AM
+
+ env:
+ DIFFUSERS_IS_CI: yes
+ HF_HOME: /mnt/cache
+ OMP_NUM_THREADS: 8
+ MKL_NUM_THREADS: 8
+
+ jobs:
+ torch_pipelines_cuda_benchmark_tests:
+ env:
+ SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL_BENCHMARK }}
+ name: Torch Core Pipelines CUDA Benchmarking Tests
+ strategy:
+ fail-fast: false
+ max-parallel: 1
+ runs-on:
+ group: aws-g6-4xlarge-plus
+ container:
+ image: diffusers/diffusers-pytorch-compile-cuda
+ options: --shm-size "16gb" --ipc host --gpus 0
+ steps:
+ - name: Checkout diffusers
+ uses: actions/checkout@v3
+ with:
+ fetch-depth: 2
+ - name: NVIDIA-SMI
+ run: |
+ nvidia-smi
+ - name: Install dependencies
+ run: |
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+ python -m uv pip install -e [quality,test]
+ python -m uv pip install pandas peft
+ - name: Environment
+ run: |
+ python utils/print_env.py
+ - name: Diffusers Benchmarking
+ env:
+ HF_TOKEN: ${{ secrets.DIFFUSERS_BOT_TOKEN }}
+ BASE_PATH: benchmark_outputs
+ run: |
+ export TOTAL_GPU_MEMORY=$(python -c "import torch; print(torch.cuda.get_device_properties(0).total_memory / (1024**3))")
+ cd benchmarks && mkdir ${BASE_PATH} && python run_all.py && python push_results.py
+
+ - name: Test suite reports artifacts
+ if: ${{ always() }}
+ uses: actions/upload-artifact@v2
+ with:
+ name: benchmark_test_reports
+ path: benchmarks/benchmark_outputs
+
+ - name: Report success status
+ if: ${{ success() }}
+ run: |
+ pip install requests && python utils/notify_benchmarking_status.py --status=success
+
+ - name: Report failure status
+ if: ${{ failure() }}
+ run: |
+ pip install requests && python utils/notify_benchmarking_status.py --status=failure
diffusers/.github/workflows/build_docker_images.yml ADDED
@@ -0,0 +1,101 @@
+ name: Test, build, and push Docker images
+
+ on:
+ pull_request: # During PRs, we just check if the changed Dockerfiles can be successfully built
+ branches:
+ - main
+ paths:
+ - "docker/**"
+ workflow_dispatch:
+ schedule:
+ - cron: "0 0 * * *" # every day at midnight
+
+ concurrency:
+ group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+ cancel-in-progress: true
+
+ env:
+ REGISTRY: diffusers
+ CI_SLACK_CHANNEL: ${{ secrets.CI_DOCKER_CHANNEL }}
+
+ jobs:
+ test-build-docker-images:
+ runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+ if: github.event_name == 'pull_request'
+ steps:
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@v1
+
+ - name: Check out code
+ uses: actions/checkout@v3
+
+ - name: Find Changed Dockerfiles
+ id: file_changes
+ uses: jitterbit/get-changed-files@v1
+ with:
+ format: 'space-delimited'
+ token: ${{ secrets.GITHUB_TOKEN }}
+
+ - name: Build Changed Docker Images
+ run: |
+ CHANGED_FILES="${{ steps.file_changes.outputs.all }}"
+ for FILE in $CHANGED_FILES; do
+ if [[ "$FILE" == docker/*Dockerfile ]]; then
+ DOCKER_PATH="${FILE%/Dockerfile}"
+ DOCKER_TAG=$(basename "$DOCKER_PATH")
+ echo "Building Docker image for $DOCKER_TAG"
+ docker build -t "$DOCKER_TAG" "$DOCKER_PATH"
+ fi
+ done
+ if: steps.file_changes.outputs.all != ''
+
+ build-and-push-docker-images:
+ runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+ if: github.event_name != 'pull_request'
+
+ permissions:
+ contents: read
+ packages: write
+
+ strategy:
+ fail-fast: false
+ matrix:
+ image-name:
+ - diffusers-pytorch-cpu
+ - diffusers-pytorch-cuda
+ - diffusers-pytorch-compile-cuda
+ - diffusers-pytorch-xformers-cuda
+ - diffusers-flax-cpu
+ - diffusers-flax-tpu
+ - diffusers-onnxruntime-cpu
+ - diffusers-onnxruntime-cuda
+ - diffusers-doc-builder
+
+ steps:
+ - name: Checkout repository
+ uses: actions/checkout@v3
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@v1
+ - name: Login to Docker Hub
+ uses: docker/login-action@v2
+ with:
+ username: ${{ env.REGISTRY }}
+ password: ${{ secrets.DOCKERHUB_TOKEN }}
+ - name: Build and push
+ uses: docker/build-push-action@v3
+ with:
+ no-cache: true
+ context: ./docker/${{ matrix.image-name }}
+ push: true
+ tags: ${{ env.REGISTRY }}/${{ matrix.image-name }}:latest
+
+ - name: Post to a Slack channel
+ id: slack
+ uses: huggingface/hf-workflows/.github/actions/post-slack@main
+ with:
+ # Slack channel id, channel name, or user id to post message.
+ # See also: https://api.slack.com/methods/chat.postMessage#channels
+ slack_channel: ${{ env.CI_SLACK_CHANNEL }}
+ title: "🤗 Results of the ${{ matrix.image-name }} Docker Image build"
+ status: ${{ job.status }}
+ slack_token: ${{ secrets.SLACK_CIFEEDBACK_BOT_TOKEN }}
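The PR job above derives each image tag purely from the path of a changed Dockerfile: `docker/<image-name>/Dockerfile` becomes the tag `<image-name>`. A minimal sketch of that derivation (the `docker_tag_for` helper name is ours, not part of the workflow):

```shell
# Sketch of the tag derivation used by the "Build Changed Docker Images"
# step: strip the trailing /Dockerfile, then take the directory's basename.
docker_tag_for() {
  local path="${1%/Dockerfile}"   # docker/<image-name>/Dockerfile -> docker/<image-name>
  basename "$path"                # docker/<image-name> -> <image-name>
}

docker_tag_for docker/diffusers-pytorch-cuda/Dockerfile   # -> diffusers-pytorch-cuda
```

This convention keeps the PR check and the nightly push job consistent: the matrix entries in `build-and-push-docker-images` are exactly the `docker/` subdirectory names.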
diffusers/.github/workflows/build_documentation.yml ADDED
@@ -0,0 +1,27 @@
+ name: Build documentation
+
+ on:
+ push:
+ branches:
+ - main
+ - doc-builder*
+ - v*-release
+ - v*-patch
+ paths:
+ - "src/diffusers/**.py"
+ - "examples/**"
+ - "docs/**"
+
+ jobs:
+ build:
+ uses: huggingface/doc-builder/.github/workflows/build_main_documentation.yml@main
+ with:
+ commit_sha: ${{ github.sha }}
+ install_libgl1: true
+ package: diffusers
+ notebook_folder: diffusers_doc
+ languages: en ko zh ja pt
+ custom_container: diffusers/diffusers-doc-builder
+ secrets:
+ token: ${{ secrets.HUGGINGFACE_PUSH }}
+ hf_token: ${{ secrets.HF_DOC_BUILD_PUSH }}
diffusers/.github/workflows/build_pr_documentation.yml ADDED
@@ -0,0 +1,23 @@
+ name: Build PR Documentation
+
+ on:
+ pull_request:
+ paths:
+ - "src/diffusers/**.py"
+ - "examples/**"
+ - "docs/**"
+
+ concurrency:
+ group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+ cancel-in-progress: true
+
+ jobs:
+ build:
+ uses: huggingface/doc-builder/.github/workflows/build_pr_documentation.yml@main
+ with:
+ commit_sha: ${{ github.event.pull_request.head.sha }}
+ pr_number: ${{ github.event.number }}
+ install_libgl1: true
+ package: diffusers
+ languages: en ko zh ja pt
+ custom_container: diffusers/diffusers-doc-builder
diffusers/.github/workflows/mirror_community_pipeline.yml ADDED
@@ -0,0 +1,102 @@
+ name: Mirror Community Pipeline
+
+ on:
+ # Push changes on the main branch
+ push:
+ branches:
+ - main
+ paths:
+ - 'examples/community/**.py'
+
+ # And on tag creation (e.g. `v0.28.1`)
+ tags:
+ - '*'
+
+ # Manual trigger with ref input
+ workflow_dispatch:
+ inputs:
+ ref:
+ description: "Either 'main' or a tag ref"
+ required: true
+ default: 'main'
+
+ jobs:
+ mirror_community_pipeline:
+ env:
+ SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL_COMMUNITY_MIRROR }}
+
+ runs-on: ubuntu-latest
+ steps:
+ # Checkout to correct ref
+ # If workflow dispatch
+ # If ref is 'main', set:
+ # CHECKOUT_REF=refs/heads/main
+ # PATH_IN_REPO=main
+ # Else it must be a tag. Set:
+ # CHECKOUT_REF=refs/tags/{tag}
+ # PATH_IN_REPO={tag}
+ # If not workflow dispatch
+ # If ref is 'refs/heads/main' => set 'main'
+ # Else it must be a tag => set {tag}
+ - name: Set checkout_ref and path_in_repo
+ run: |
+ if [ "${{ github.event_name }}" == "workflow_dispatch" ]; then
+ if [ -z "${{ github.event.inputs.ref }}" ]; then
+ echo "Error: Missing ref input"
+ exit 1
+ elif [ "${{ github.event.inputs.ref }}" == "main" ]; then
+ echo "CHECKOUT_REF=refs/heads/main" >> $GITHUB_ENV
+ echo "PATH_IN_REPO=main" >> $GITHUB_ENV
+ else
+ echo "CHECKOUT_REF=refs/tags/${{ github.event.inputs.ref }}" >> $GITHUB_ENV
+ echo "PATH_IN_REPO=${{ github.event.inputs.ref }}" >> $GITHUB_ENV
+ fi
+ elif [ "${{ github.ref }}" == "refs/heads/main" ]; then
+ echo "CHECKOUT_REF=${{ github.ref }}" >> $GITHUB_ENV
+ echo "PATH_IN_REPO=main" >> $GITHUB_ENV
+ else
+ # e.g. refs/tags/v0.28.1 -> v0.28.1
+ echo "CHECKOUT_REF=${{ github.ref }}" >> $GITHUB_ENV
+ echo "PATH_IN_REPO=$(echo ${{ github.ref }} | sed 's/^refs\/tags\///')" >> $GITHUB_ENV
+ fi
+ - name: Print env vars
+ run: |
+ echo "CHECKOUT_REF: ${{ env.CHECKOUT_REF }}"
+ echo "PATH_IN_REPO: ${{ env.PATH_IN_REPO }}"
+ - uses: actions/checkout@v3
+ with:
+ ref: ${{ env.CHECKOUT_REF }}
+
+ # Setup + install dependencies
+ - name: Set up Python
+ uses: actions/setup-python@v4
+ with:
+ python-version: "3.10"
+ - name: Install dependencies
+ run: |
+ python -m pip install --upgrade pip
+ pip install --upgrade huggingface_hub
+
+ # Check secret is set
+ - name: whoami
+ run: huggingface-cli whoami
+ env:
+ HF_TOKEN: ${{ secrets.HF_TOKEN_MIRROR_COMMUNITY_PIPELINES }}
+
+ # Push to HF! (under subfolder based on checkout ref)
+ # https://huggingface.co/datasets/diffusers/community-pipelines-mirror
+ - name: Mirror community pipeline to HF
+ run: huggingface-cli upload diffusers/community-pipelines-mirror ./examples/community ${PATH_IN_REPO} --repo-type dataset
+ env:
+ PATH_IN_REPO: ${{ env.PATH_IN_REPO }}
+ HF_TOKEN: ${{ secrets.HF_TOKEN_MIRROR_COMMUNITY_PIPELINES }}
+
+ - name: Report success status
+ if: ${{ success() }}
+ run: |
+ pip install requests && python utils/notify_community_pipelines_mirror.py --status=success
+
+ - name: Report failure status
+ if: ${{ failure() }}
+ run: |
+ pip install requests && python utils/notify_community_pipelines_mirror.py --status=failure
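The ref-handling step in the mirror workflow reduces to one mapping: `refs/heads/main` mirrors to the `main` subfolder, and `refs/tags/<tag>` mirrors to a `<tag>` subfolder. A minimal sketch of that mapping as a standalone function (`path_in_repo` is our helper name; the workflow uses an inline `sed` call with the same effect as the `${1#refs/tags/}` expansion here):

```shell
# Sketch of the CHECKOUT_REF -> PATH_IN_REPO mapping used by the
# "Set checkout_ref and path_in_repo" step.
path_in_repo() {
  case "$1" in
    refs/heads/main) echo "main" ;;              # main branch mirrors to main/
    refs/tags/*)     echo "${1#refs/tags/}" ;;   # refs/tags/v0.28.1 -> v0.28.1
    *)               echo "unexpected ref: $1" >&2; return 1 ;;
  esac
}

path_in_repo refs/tags/v0.28.1   # -> v0.28.1
```

Each tagged release therefore gets its own immutable snapshot of `examples/community` in the mirror dataset, while pushes to `main` keep overwriting the `main/` subfolder.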
diffusers/.github/workflows/nightly_tests.yml ADDED
@@ -0,0 +1,414 @@
+ name: Nightly and release tests on main/release branch
+
+ on:
+ workflow_dispatch:
+ schedule:
+ - cron: "0 0 * * *" # every day at midnight
+
+ env:
+ DIFFUSERS_IS_CI: yes
+ HF_HOME: /mnt/cache
+ OMP_NUM_THREADS: 8
+ MKL_NUM_THREADS: 8
+ PYTEST_TIMEOUT: 600
+ RUN_SLOW: yes
+ RUN_NIGHTLY: yes
+ PIPELINE_USAGE_CUTOFF: 5000
+ SLACK_API_TOKEN: ${{ secrets.SLACK_CIFEEDBACK_BOT_TOKEN }}
+
+ jobs:
+ setup_torch_cuda_pipeline_matrix:
+ name: Setup Torch Pipelines Matrix
+ runs-on: diffusers/diffusers-pytorch-cpu
+ outputs:
+ pipeline_test_matrix: ${{ steps.fetch_pipeline_matrix.outputs.pipeline_test_matrix }}
+ steps:
+ - name: Checkout diffusers
+ uses: actions/checkout@v3
+ with:
+ fetch-depth: 2
+ - name: Set up Python
+ uses: actions/setup-python@v4
+ with:
+ python-version: "3.8"
+ - name: Install dependencies
+ run: |
+ pip install -e .
+ pip install huggingface_hub
+ - name: Fetch Pipeline Matrix
+ id: fetch_pipeline_matrix
+ run: |
+ matrix=$(python utils/fetch_torch_cuda_pipeline_test_matrix.py)
+ echo $matrix
+ echo "pipeline_test_matrix=$matrix" >> $GITHUB_OUTPUT
+
+ - name: Pipeline Tests Artifacts
+ if: ${{ always() }}
+ uses: actions/upload-artifact@v2
+ with:
+ name: test-pipelines.json
+ path: reports
+
+ run_nightly_tests_for_torch_pipelines:
+ name: Torch Pipelines CUDA Nightly Tests
+ needs: setup_torch_cuda_pipeline_matrix
+ strategy:
+ fail-fast: false
+ matrix:
+ module: ${{ fromJson(needs.setup_torch_cuda_pipeline_matrix.outputs.pipeline_test_matrix) }}
+ runs-on: [single-gpu, nvidia-gpu, t4, ci]
+ container:
+ image: diffusers/diffusers-pytorch-cuda
+ options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --gpus 0
+ steps:
+ - name: Checkout diffusers
+ uses: actions/checkout@v3
+ with:
+ fetch-depth: 2
+ - name: NVIDIA-SMI
+ run: nvidia-smi
+
+ - name: Install dependencies
+ run: |
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+ python -m uv pip install -e [quality,test]
+ python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+ python -m uv pip install pytest-reportlog
+
+ - name: Environment
+ run: |
+ python utils/print_env.py
+
+ - name: Nightly PyTorch CUDA checkpoint (pipelines) tests
+ env:
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
+ # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
+ CUBLAS_WORKSPACE_CONFIG: :16:8
+ run: |
+ python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+ -s -v -k "not Flax and not Onnx" \
+ --make-reports=tests_pipeline_${{ matrix.module }}_cuda \
+ --report-log=tests_pipeline_${{ matrix.module }}_cuda.log \
+ tests/pipelines/${{ matrix.module }}
+
+ - name: Failure short reports
+ if: ${{ failure() }}
+ run: |
+ cat reports/tests_pipeline_${{ matrix.module }}_cuda_stats.txt
+ cat reports/tests_pipeline_${{ matrix.module }}_cuda_failures_short.txt
+
+ - name: Test suite reports artifacts
+ if: ${{ always() }}
+ uses: actions/upload-artifact@v2
+ with:
+ name: pipeline_${{ matrix.module }}_test_reports
+ path: reports
+
+ - name: Generate Report and Notify Channel
+ if: always()
+ run: |
+ pip install slack_sdk tabulate
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
+
+ run_nightly_tests_for_other_torch_modules:
+ name: Torch Non-Pipelines CUDA Nightly Tests
+ runs-on: [single-gpu, nvidia-gpu, t4, ci]
+ container:
+ image: diffusers/diffusers-pytorch-cuda
+ options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/ --gpus 0
+ defaults:
+ run:
+ shell: bash
+ strategy:
+ matrix:
+ module: [models, schedulers, others, examples]
+ steps:
+ - name: Checkout diffusers
+ uses: actions/checkout@v3
+ with:
+ fetch-depth: 2
+
+ - name: Install dependencies
+ run: |
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+ python -m uv pip install -e [quality,test]
+ python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+ python -m uv pip install pytest-reportlog
+
+ - name: Environment
+ run: python utils/print_env.py
+
+ - name: Run nightly PyTorch CUDA tests for non-pipeline modules
+ if: ${{ matrix.module != 'examples'}}
+ env:
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
+ # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
+ CUBLAS_WORKSPACE_CONFIG: :16:8
+ run: |
+ python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+ -s -v -k "not Flax and not Onnx" \
+ --make-reports=tests_torch_${{ matrix.module }}_cuda \
+ --report-log=tests_torch_${{ matrix.module }}_cuda.log \
+ tests/${{ matrix.module }}
+
+ - name: Run nightly example tests with Torch
155
+ if: ${{ matrix.module == 'examples' }}
156
+ env:
157
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
158
+ # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
159
+ CUBLAS_WORKSPACE_CONFIG: :16:8
160
+ run: |
161
+ python -m uv pip install peft@git+https://github.com/huggingface/peft.git
162
+ python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
163
+ -s -v --make-reports=examples_torch_cuda \
164
+ --report-log=examples_torch_cuda.log \
165
+ examples/
166
+
167
+ - name: Failure short reports
168
+ if: ${{ failure() }}
169
+ run: |
170
+ cat reports/tests_torch_${{ matrix.module }}_cuda_stats.txt
171
+ cat reports/tests_torch_${{ matrix.module }}_cuda_failures_short.txt
172
+
173
+ - name: Test suite reports artifacts
174
+ if: ${{ always() }}
175
+ uses: actions/upload-artifact@v2
176
+ with:
177
+ name: torch_${{ matrix.module }}_cuda_test_reports
178
+ path: reports
179
+
180
+ - name: Generate Report and Notify Channel
181
+ if: always()
182
+ run: |
183
+ pip install slack_sdk tabulate
184
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
185
+
186
+ run_lora_nightly_tests:
187
+ name: Nightly LoRA Tests with PEFT and TORCH
188
+ runs-on: [single-gpu, nvidia-gpu, t4, ci]
189
+ container:
190
+ image: diffusers/diffusers-pytorch-cuda
191
+ options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/ --gpus 0
192
+ defaults:
193
+ run:
194
+ shell: bash
195
+ steps:
196
+ - name: Checkout diffusers
197
+ uses: actions/checkout@v3
198
+ with:
199
+ fetch-depth: 2
200
+
201
+ - name: Install dependencies
202
+ run: |
203
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
204
+ python -m uv pip install -e [quality,test]
205
+ python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
206
+ python -m uv pip install peft@git+https://github.com/huggingface/peft.git
207
+ python -m uv pip install pytest-reportlog
208
+
209
+ - name: Environment
210
+ run: python utils/print_env.py
211
+
212
+ - name: Run nightly LoRA tests with PEFT and Torch
213
+ env:
214
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
215
+ # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
216
+ CUBLAS_WORKSPACE_CONFIG: :16:8
217
+ run: |
218
+ python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
219
+ -s -v -k "not Flax and not Onnx" \
220
+ --make-reports=tests_torch_lora_cuda \
221
+ --report-log=tests_torch_lora_cuda.log \
222
+ tests/lora
223
+
224
+ - name: Failure short reports
225
+ if: ${{ failure() }}
226
+ run: |
227
+ cat reports/tests_torch_lora_cuda_stats.txt
228
+ cat reports/tests_torch_lora_cuda_failures_short.txt
229
+
230
+ - name: Test suite reports artifacts
231
+ if: ${{ always() }}
232
+ uses: actions/upload-artifact@v2
233
+ with:
234
+ name: torch_lora_cuda_test_reports
235
+ path: reports
236
+
237
+ - name: Generate Report and Notify Channel
238
+ if: always()
239
+ run: |
240
+ pip install slack_sdk tabulate
241
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
242
+
243
+ run_flax_tpu_tests:
244
+ name: Nightly Flax TPU Tests
245
+ runs-on: docker-tpu
246
+ if: github.event_name == 'schedule'
247
+
248
+ container:
249
+ image: diffusers/diffusers-flax-tpu
250
+ options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/ --privileged
251
+ defaults:
252
+ run:
253
+ shell: bash
254
+ steps:
255
+ - name: Checkout diffusers
256
+ uses: actions/checkout@v3
257
+ with:
258
+ fetch-depth: 2
259
+
260
+ - name: Install dependencies
261
+ run: |
262
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
263
+ python -m uv pip install -e [quality,test]
264
+ python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
265
+ python -m uv pip install pytest-reportlog
266
+
267
+ - name: Environment
268
+ run: python utils/print_env.py
269
+
270
+ - name: Run nightly Flax TPU tests
271
+ env:
272
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
273
+ run: |
274
+ python -m pytest -n 0 \
275
+ -s -v -k "Flax" \
276
+ --make-reports=tests_flax_tpu \
277
+ --report-log=tests_flax_tpu.log \
278
+ tests/
279
+
280
+ - name: Failure short reports
281
+ if: ${{ failure() }}
282
+ run: |
283
+ cat reports/tests_flax_tpu_stats.txt
284
+ cat reports/tests_flax_tpu_failures_short.txt
285
+
286
+ - name: Test suite reports artifacts
287
+ if: ${{ always() }}
288
+ uses: actions/upload-artifact@v2
289
+ with:
290
+ name: flax_tpu_test_reports
291
+ path: reports
292
+
293
+ - name: Generate Report and Notify Channel
294
+ if: always()
295
+ run: |
296
+ pip install slack_sdk tabulate
297
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
298
+
299
+ run_nightly_onnx_tests:
300
+ name: Nightly ONNXRuntime CUDA tests on Ubuntu
301
+ runs-on: [single-gpu, nvidia-gpu, t4, ci]
302
+ container:
303
+ image: diffusers/diffusers-onnxruntime-cuda
304
+ options: --gpus 0 --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
305
+
306
+ steps:
307
+ - name: Checkout diffusers
308
+ uses: actions/checkout@v3
309
+ with:
310
+ fetch-depth: 2
311
+
312
+ - name: NVIDIA-SMI
313
+ run: nvidia-smi
314
+
315
+ - name: Install dependencies
316
+ run: |
317
+ python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
318
+ python -m uv pip install -e [quality,test]
319
+ python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
320
+ python -m uv pip install pytest-reportlog
321
+
322
+ - name: Environment
323
+ run: python utils/print_env.py
324
+
325
+ - name: Run nightly ONNXRuntime CUDA tests
326
+ env:
327
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
328
+ run: |
329
+ python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
330
+ -s -v -k "Onnx" \
331
+ --make-reports=tests_onnx_cuda \
332
+ --report-log=tests_onnx_cuda.log \
333
+ tests/
334
+
335
+ - name: Failure short reports
336
+ if: ${{ failure() }}
337
+ run: |
338
+ cat reports/tests_onnx_cuda_stats.txt
339
+ cat reports/tests_onnx_cuda_failures_short.txt
340
+
341
+ - name: Test suite reports artifacts
342
+ if: ${{ always() }}
343
+ uses: actions/upload-artifact@v2
344
+ with:
345
+ name: ${{ matrix.config.report }}_test_reports
346
+ path: reports
347
+
348
+ - name: Generate Report and Notify Channel
349
+ if: always()
350
+ run: |
351
+ pip install slack_sdk tabulate
352
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
353
+
354
+ run_nightly_tests_apple_m1:
355
+ name: Nightly PyTorch MPS tests on MacOS
356
+ runs-on: [ self-hosted, apple-m1 ]
357
+ if: github.event_name == 'schedule'
358
+
359
+ steps:
360
+ - name: Checkout diffusers
361
+ uses: actions/checkout@v3
362
+ with:
363
+ fetch-depth: 2
364
+
365
+ - name: Clean checkout
366
+ shell: arch -arch arm64 bash {0}
367
+ run: |
368
+ git clean -fxd
369
+
370
+ - name: Setup miniconda
371
+ uses: ./.github/actions/setup-miniconda
372
+ with:
373
+ python-version: 3.9
374
+
375
+ - name: Install dependencies
376
+ shell: arch -arch arm64 bash {0}
377
+ run: |
378
+ ${CONDA_RUN} python -m pip install --upgrade pip uv
379
+ ${CONDA_RUN} python -m uv pip install -e [quality,test]
380
+ ${CONDA_RUN} python -m uv pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cpu
381
+ ${CONDA_RUN} python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate
382
+ ${CONDA_RUN} python -m uv pip install pytest-reportlog
383
+
384
+ - name: Environment
385
+ shell: arch -arch arm64 bash {0}
386
+ run: |
387
+ ${CONDA_RUN} python utils/print_env.py
388
+
389
+ - name: Run nightly PyTorch tests on M1 (MPS)
390
+ shell: arch -arch arm64 bash {0}
391
+ env:
392
+ HF_HOME: /System/Volumes/Data/mnt/cache
393
+ HF_TOKEN: ${{ secrets.HF_TOKEN }}
394
+ run: |
395
+ ${CONDA_RUN} python -m pytest -n 1 -s -v --make-reports=tests_torch_mps \
396
+ --report-log=tests_torch_mps.log \
397
+ tests/
398
+
399
+ - name: Failure short reports
400
+ if: ${{ failure() }}
401
+ run: cat reports/tests_torch_mps_failures_short.txt
402
+
403
+ - name: Test suite reports artifacts
404
+ if: ${{ always() }}
405
+ uses: actions/upload-artifact@v2
406
+ with:
407
+ name: torch_mps_test_reports
408
+ path: reports
409
+
410
+ - name: Generate Report and Notify Channel
411
+ if: always()
412
+ run: |
413
+ pip install slack_sdk tabulate
414
+ python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
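The nightly pipeline jobs above are fanned out by the `setup_torch_cuda_pipeline_matrix` job, which prints a JSON list to `$GITHUB_OUTPUT` that `fromJson(...)` then expands into one job per `matrix.module`. A minimal sketch of that handshake — `build_pipeline_test_matrix` is a hypothetical stand-in for `utils/fetch_torch_cuda_pipeline_test_matrix.py` (only the script name comes from the workflow; its internals are assumed here):

```python
import json

def build_pipeline_test_matrix(test_root_entries):
    # Turn the pipeline test folder names into a JSON array; each entry
    # becomes one `matrix.module` value after `fromJson` in the workflow.
    # Folders starting with "_" are treated as non-test helpers (an assumption).
    modules = sorted(e for e in test_root_entries if not e.startswith("_"))
    return json.dumps(modules)

matrix = build_pipeline_test_matrix(["stable_diffusion", "controlnet", "_utils"])
# The workflow step then appends the string to the step outputs file:
#   echo "pipeline_test_matrix=$matrix" >> $GITHUB_OUTPUT
```

The output must be strict JSON (double quotes), since `fromJson` in the matrix expression rejects Python-style single-quoted lists.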
diffusers/.github/workflows/notify_slack_about_release.yml ADDED
@@ -0,0 +1,23 @@
+name: Notify Slack about a release
+
+on:
+  workflow_dispatch:
+  release:
+    types: [published]
+
+jobs:
+  build:
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/checkout@v3
+
+      - name: Setup Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.8'
+
+      - name: Notify Slack about the release
+        env:
+          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
+        run: pip install requests && python utils/notify_slack_about_release.py
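The release job above delegates to `utils/notify_slack_about_release.py`. A hedged sketch of the kind of payload such a script would build — the function name and message format are assumptions, only the script name and the `SLACK_WEBHOOK_URL` secret come from the workflow:

```python
import json

def build_release_payload(tag: str, url: str) -> str:
    # Hypothetical helper: assemble the JSON body a Slack incoming webhook expects.
    return json.dumps({"text": f"New diffusers release *{tag}* is out: {url}"})

payload = build_release_payload(
    "v0.29.0", "https://github.com/huggingface/diffusers/releases/tag/v0.29.0"
)
# The real script would POST this body to the webhook, roughly:
#   requests.post(os.environ["SLACK_WEBHOOK_URL"], data=payload)
```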
diffusers/.github/workflows/pr_dependency_test.yml ADDED
@@ -0,0 +1,35 @@
+name: Run dependency tests
+
+on:
+  pull_request:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+  push:
+    branches:
+      - main
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  check_dependencies:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pip install --upgrade pip uv
+          python -m uv pip install -e .
+          python -m uv pip install pytest
+      - name: Check for soft dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          pytest tests/others/test_dependencies.py
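This workflow installs the package with no optional backends and runs `tests/others/test_dependencies.py` to confirm nothing eagerly imports a missing "soft" dependency. A generic sketch of such a check (a stdlib example of the pattern, not the diffusers test itself):

```python
import importlib
import sys

def assert_soft_dependency(package: str, optional: str) -> None:
    # Import `package` and verify the optional module was not pulled in eagerly;
    # a soft dependency should only load when explicitly requested.
    sys.modules.pop(optional, None)
    importlib.import_module(package)
    assert optional not in sys.modules, f"{package} eagerly imports {optional}"

# Stdlib example: importing `json` must not drag in `csv`.
assert_soft_dependency("json", "csv")
```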
diffusers/.github/workflows/pr_flax_dependency_test.yml ADDED
@@ -0,0 +1,38 @@
+name: Run Flax dependency tests
+
+on:
+  pull_request:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+  push:
+    branches:
+      - main
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  check_flax_dependencies:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pip install --upgrade pip uv
+          python -m uv pip install -e .
+          python -m uv pip install "jax[cpu]>=0.2.16,!=0.3.2"
+          python -m uv pip install "flax>=0.4.1"
+          python -m uv pip install "jaxlib>=0.1.65"
+          python -m uv pip install pytest
+      - name: Check for soft dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          pytest tests/others/test_dependencies.py
diffusers/.github/workflows/pr_test_fetcher.yml ADDED
@@ -0,0 +1,174 @@
+name: Fast tests for PRs - Test Fetcher
+
+on: workflow_dispatch
+
+env:
+  DIFFUSERS_IS_CI: yes
+  OMP_NUM_THREADS: 4
+  MKL_NUM_THREADS: 4
+  PYTEST_TIMEOUT: 60
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  setup_pr_tests:
+    name: Setup PR Tests
+    runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+    container:
+      image: diffusers/diffusers-pytorch-cpu
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+    defaults:
+      run:
+        shell: bash
+    outputs:
+      matrix: ${{ steps.set_matrix.outputs.matrix }}
+      test_map: ${{ steps.set_matrix.outputs.test_map }}
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 0
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+      - name: Environment
+        run: |
+          python utils/print_env.py
+          echo $(git --version)
+      - name: Fetch Tests
+        run: |
+          python utils/tests_fetcher.py | tee test_preparation.txt
+      - name: Report fetched tests
+        uses: actions/upload-artifact@v3
+        with:
+          name: test_fetched
+          path: test_preparation.txt
+      - id: set_matrix
+        name: Create Test Matrix
+        # `keys` is used as the GitHub Actions matrix for jobs, i.e. `models`, `pipelines`, etc.
+        # `test_map` is used to get the actual identified test files under each key.
+        # If there is no test to run (so no `test_map.json` file), create a dummy map (an empty matrix would fail).
+        run: |
+          if [ -f test_map.json ]; then
+            keys=$(python3 -c 'import json; fp = open("test_map.json"); test_map = json.load(fp); fp.close(); d = list(test_map.keys()); print(json.dumps(d))')
+            test_map=$(python3 -c 'import json; fp = open("test_map.json"); test_map = json.load(fp); fp.close(); print(json.dumps(test_map))')
+          else
+            keys=$(python3 -c 'import json; print(json.dumps(["dummy"]))')
+            test_map=$(python3 -c 'import json; print(json.dumps({"dummy": []}))')
+          fi
+          echo $keys
+          echo $test_map
+          echo "matrix=$keys" >> $GITHUB_OUTPUT
+          echo "test_map=$test_map" >> $GITHUB_OUTPUT
+
+  run_pr_tests:
+    name: Run PR Tests
+    needs: setup_pr_tests
+    if: contains(fromJson(needs.setup_pr_tests.outputs.matrix), 'dummy') != true
+    strategy:
+      fail-fast: false
+      max-parallel: 2
+      matrix:
+        modules: ${{ fromJson(needs.setup_pr_tests.outputs.matrix) }}
+    runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+    container:
+      image: diffusers/diffusers-pytorch-cpu
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+    defaults:
+      run:
+        shell: bash
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pip install -e [quality,test]
+          python -m pip install accelerate
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run all selected tests on CPU
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 2 --dist=loadfile -v --make-reports=${{ matrix.modules }}_tests_cpu ${{ fromJson(needs.setup_pr_tests.outputs.test_map)[matrix.modules] }}
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        continue-on-error: true
+        run: |
+          cat reports/${{ matrix.modules }}_tests_cpu_stats.txt
+          cat reports/${{ matrix.modules }}_tests_cpu_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v3
+        with:
+          name: ${{ matrix.modules }}_test_reports
+          path: reports
+
+  run_staging_tests:
+    strategy:
+      fail-fast: false
+      matrix:
+        config:
+          - name: Hub tests for models, schedulers, and pipelines
+            framework: hub_tests_pytorch
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_hub
+
+    name: ${{ matrix.config.name }}
+    runs-on: ${{ matrix.config.runner }}
+    container:
+      image: ${{ matrix.config.image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pip install -e [quality,test]
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run Hub tests for models, schedulers, and pipelines on a staging env
+        if: ${{ matrix.config.framework == 'hub_tests_pytorch' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          HUGGINGFACE_CO_STAGING=true python -m pytest \
+            -m "is_staging_test" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_${{ matrix.config.report }}_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_${{ matrix.config.report }}_test_reports
+          path: reports
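The `set_matrix` shell step in the test-fetcher workflow can be restated in plain Python to make the dummy-map fallback explicit. This is a sketch mirroring the step's logic, not the CI code itself; note both outputs must be strict JSON so the downstream `fromJson` expressions can parse them:

```python
import json

def set_matrix_outputs(test_map):
    # Emit (matrix keys, per-key test map) as JSON strings for $GITHUB_OUTPUT.
    # When the fetcher found nothing to run, fall back to a "dummy" entry so
    # the matrix expression does not fail on an empty list; the run_pr_tests
    # job then skips itself via contains(..., 'dummy').
    if not test_map:
        test_map = {"dummy": []}
    keys = json.dumps(sorted(test_map.keys()))
    return keys, json.dumps(test_map)

keys, mapping = set_matrix_outputs({"models": ["tests/models/test_a.py"]})
```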
diffusers/.github/workflows/pr_test_peft_backend.yml ADDED
@@ -0,0 +1,131 @@
+name: Fast tests for PRs - PEFT backend
+
+on:
+  pull_request:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+      - "tests/**.py"
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+env:
+  DIFFUSERS_IS_CI: yes
+  OMP_NUM_THREADS: 4
+  MKL_NUM_THREADS: 4
+  PYTEST_TIMEOUT: 60
+
+jobs:
+  check_code_quality:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install .[quality]
+      - name: Check quality
+        run: make quality
+      - name: Check if failure
+        if: ${{ failure() }}
+        run: |
+          echo "Quality check failed. Please ensure the right dependency versions are installed with 'pip install -e .[quality]' and run 'make style && make quality'" >> $GITHUB_STEP_SUMMARY
+
+  check_repository_consistency:
+    needs: check_code_quality
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install .[quality]
+      - name: Check repo consistency
+        run: |
+          python utils/check_copies.py
+          python utils/check_dummies.py
+          make deps_table_check_updated
+      - name: Check if failure
+        if: ${{ failure() }}
+        run: |
+          echo "Repo consistency check failed. Please ensure the right dependency versions are installed with 'pip install -e .[quality]' and run 'make fix-copies'" >> $GITHUB_STEP_SUMMARY
+
+  run_fast_tests:
+    needs: [check_code_quality, check_repository_consistency]
+    strategy:
+      fail-fast: false
+      matrix:
+        lib-versions: ["main", "latest"]
+
+    name: LoRA - ${{ matrix.lib-versions }}
+
+    runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+
+    container:
+      image: diffusers/diffusers-pytorch-cpu
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          if [ "${{ matrix.lib-versions }}" == "main" ]; then
+            python -m pip install -U peft@git+https://github.com/huggingface/peft.git
+            python -m uv pip install -U transformers@git+https://github.com/huggingface/transformers.git
+            python -m uv pip install -U accelerate@git+https://github.com/huggingface/accelerate.git
+          else
+            python -m uv pip install -U peft transformers accelerate
+          fi
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run fast PyTorch LoRA CPU tests with PEFT backend
+        # This job's matrix key is `lib-versions`, so report names are derived
+        # from it rather than from an undefined `matrix.config.report`.
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v \
+            --make-reports=tests_${{ matrix.lib-versions }} \
+            tests/lora/
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v \
+            --make-reports=tests_models_lora_${{ matrix.lib-versions }} \
+            tests/models/ -k "lora"
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_${{ matrix.lib-versions }}_failures_short.txt
+          cat reports/tests_models_lora_${{ matrix.lib-versions }}_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_${{ matrix.lib-versions }}_test_reports
+          path: reports
diffusers/.github/workflows/pr_tests.yml ADDED
@@ -0,0 +1,233 @@
+name: Fast tests for PRs
+
+on:
+  pull_request:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+      - "benchmarks/**.py"
+      - "examples/**.py"
+      - "scripts/**.py"
+      - "tests/**.py"
+      - ".github/**.yml"
+      - "utils/**.py"
+  push:
+    branches:
+      - ci-*
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+env:
+  DIFFUSERS_IS_CI: yes
+  OMP_NUM_THREADS: 4
+  MKL_NUM_THREADS: 4
+  PYTEST_TIMEOUT: 60
+
+jobs:
+  check_code_quality:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install .[quality]
+      - name: Check quality
+        run: make quality
+      - name: Check if failure
+        if: ${{ failure() }}
+        run: |
+          echo "Quality check failed. Please ensure the right dependency versions are installed with 'pip install -e .[quality]' and run 'make style && make quality'" >> $GITHUB_STEP_SUMMARY
+
+  check_repository_consistency:
+    needs: check_code_quality
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install .[quality]
+      - name: Check repo consistency
+        run: |
+          python utils/check_copies.py
+          python utils/check_dummies.py
+          make deps_table_check_updated
+      - name: Check if failure
+        if: ${{ failure() }}
+        run: |
+          echo "Repo consistency check failed. Please ensure the right dependency versions are installed with 'pip install -e .[quality]' and run 'make fix-copies'" >> $GITHUB_STEP_SUMMARY
+
+  run_fast_tests:
+    needs: [check_code_quality, check_repository_consistency]
+    strategy:
+      fail-fast: false
+      matrix:
+        config:
+          - name: Fast PyTorch Pipeline CPU tests
+            framework: pytorch_pipelines
+            runner: [ self-hosted, intel-cpu, 32-cpu, 256-ram, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_cpu_pipelines
+          - name: Fast PyTorch Models & Schedulers CPU tests
+            framework: pytorch_models
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_cpu_models_schedulers
+          - name: Fast Flax CPU tests
+            framework: flax
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-flax-cpu
+            report: flax_cpu
+          - name: PyTorch Example CPU tests
+            framework: pytorch_examples
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_example_cpu
+
+    name: ${{ matrix.config.name }}
+
+    runs-on: ${{ matrix.config.runner }}
+
+    container:
+      image: ${{ matrix.config.image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run fast PyTorch Pipeline CPU tests
+        if: ${{ matrix.config.framework == 'pytorch_pipelines' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 8 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests/pipelines
+
+      - name: Run fast PyTorch Model Scheduler CPU tests
+        if: ${{ matrix.config.framework == 'pytorch_models' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx and not Dependency" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests/models tests/schedulers tests/others
+
+      - name: Run fast Flax CPU tests
+        if: ${{ matrix.config.framework == 'flax' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "Flax" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests
+
+      - name: Run example PyTorch CPU tests
+        if: ${{ matrix.config.framework == 'pytorch_examples' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install peft timm
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            --make-reports=tests_${{ matrix.config.report }} \
+            examples
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_${{ matrix.config.report }}_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_${{ matrix.config.report }}_test_reports
+          path: reports
+
+  run_staging_tests:
+    needs: [check_code_quality, check_repository_consistency]
+    strategy:
+      fail-fast: false
+      matrix:
+        config:
+          - name: Hub tests for models, schedulers, and pipelines
+            framework: hub_tests_pytorch
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_hub
+
+    name: ${{ matrix.config.name }}
+
+    runs-on: ${{ matrix.config.runner }}
+
+    container:
+      image: ${{ matrix.config.image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run Hub tests for models, schedulers, and pipelines on a staging env
+        if: ${{ matrix.config.framework == 'hub_tests_pytorch' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          HUGGINGFACE_CO_STAGING=true python -m pytest \
+            -m "is_staging_test" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_${{ matrix.config.report }}_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_${{ matrix.config.report }}_test_reports
+          path: reports
diffusers/.github/workflows/pr_torch_dependency_test.yml ADDED
@@ -0,0 +1,36 @@
+name: Run Torch dependency tests
+
+on:
+  pull_request:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+  push:
+    branches:
+      - main
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  check_torch_dependencies:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pip install --upgrade pip uv
+          python -m uv pip install -e .
+          python -m uv pip install torch torchvision torchaudio
+          python -m uv pip install pytest
+      - name: Check for soft dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          pytest tests/others/test_dependencies.py
diffusers/.github/workflows/push_tests.yml ADDED
@@ -0,0 +1,436 @@
+name: Slow Tests on main
+
+on:
+  push:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+      - "examples/**.py"
+      - "tests/**.py"
+
+env:
+  DIFFUSERS_IS_CI: yes
+  HF_HOME: /mnt/cache
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  PYTEST_TIMEOUT: 600
+  RUN_SLOW: yes
+  PIPELINE_USAGE_CUTOFF: 50000
+
+jobs:
+  setup_torch_cuda_pipeline_matrix:
+    name: Setup Torch Pipelines CUDA Slow Tests Matrix
+    runs-on: [ self-hosted, intel-cpu, 8-cpu, ci ]
+    container:
+      image: diffusers/diffusers-pytorch-cpu
+    outputs:
+      pipeline_test_matrix: ${{ steps.fetch_pipeline_matrix.outputs.pipeline_test_matrix }}
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+      - name: Environment
+        run: |
+          python utils/print_env.py
+      - name: Fetch Pipeline Matrix
+        id: fetch_pipeline_matrix
+        run: |
+          matrix=$(python utils/fetch_torch_cuda_pipeline_test_matrix.py)
+          echo $matrix
+          echo "pipeline_test_matrix=$matrix" >> $GITHUB_OUTPUT
+      - name: Pipeline Tests Artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: test-pipelines.json
+          path: reports
+
+  torch_pipelines_cuda_tests:
+    name: Torch Pipelines CUDA Slow Tests
+    needs: setup_torch_cuda_pipeline_matrix
+    strategy:
+      fail-fast: false
+      max-parallel: 8
+      matrix:
+        module: ${{ fromJson(needs.setup_torch_cuda_pipeline_matrix.outputs.pipeline_test_matrix) }}
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+    container:
+      image: diffusers/diffusers-pytorch-cuda
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --gpus 0
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+      - name: NVIDIA-SMI
+        run: |
+          nvidia-smi
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+      - name: Environment
+        run: |
+          python utils/print_env.py
+      - name: Slow PyTorch CUDA checkpoint tests on Ubuntu
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+          # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
+          CUBLAS_WORKSPACE_CONFIG: :16:8
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx" \
+            --make-reports=tests_pipeline_${{ matrix.module }}_cuda \
+            tests/pipelines/${{ matrix.module }}
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_pipeline_${{ matrix.module }}_cuda_stats.txt
+          cat reports/tests_pipeline_${{ matrix.module }}_cuda_failures_short.txt
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pipeline_${{ matrix.module }}_test_reports
+          path: reports
+
+  torch_cuda_tests:
+    name: Torch CUDA Tests
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+    container:
+      image: diffusers/diffusers-pytorch-cuda
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --gpus 0
+    defaults:
+      run:
+        shell: bash
+    strategy:
+      matrix:
+        module: [models, schedulers, lora, others, single_file]
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+
+      - name: Environment
+        run: |
+          python utils/print_env.py
+
+      - name: Run slow PyTorch CUDA tests
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+          # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
+          CUBLAS_WORKSPACE_CONFIG: :16:8
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx" \
+            --make-reports=tests_torch_cuda \
+            tests/${{ matrix.module }}
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_torch_cuda_stats.txt
+          cat reports/tests_torch_cuda_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: torch_cuda_test_reports
+          path: reports
+
+  peft_cuda_tests:
+    name: PEFT CUDA Tests
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+    container:
+      image: diffusers/diffusers-pytorch-cuda
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --gpus 0
+    defaults:
+      run:
+        shell: bash
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+          python -m pip install -U peft@git+https://github.com/huggingface/peft.git
+
+      - name: Environment
+        run: |
+          python utils/print_env.py
+
+      - name: Run slow PEFT CUDA tests
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+          # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
+          CUBLAS_WORKSPACE_CONFIG: :16:8
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx and not PEFTLoRALoading" \
+            --make-reports=tests_peft_cuda \
+            tests/lora/
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "lora and not Flax and not Onnx and not PEFTLoRALoading" \
+            --make-reports=tests_peft_cuda_models_lora \
+            tests/models/
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_peft_cuda_stats.txt
+          cat reports/tests_peft_cuda_failures_short.txt
+          cat reports/tests_peft_cuda_models_lora_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: torch_peft_test_reports
+          path: reports
+
+  flax_tpu_tests:
+    name: Flax TPU Tests
+    runs-on: docker-tpu
+    container:
+      image: diffusers/diffusers-flax-tpu
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/ --privileged
+    defaults:
+      run:
+        shell: bash
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+
+      - name: Environment
+        run: |
+          python utils/print_env.py
+
+      - name: Run slow Flax TPU tests
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+        run: |
+          python -m pytest -n 0 \
+            -s -v -k "Flax" \
+            --make-reports=tests_flax_tpu \
+            tests/
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_flax_tpu_stats.txt
+          cat reports/tests_flax_tpu_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: flax_tpu_test_reports
+          path: reports
+
+  onnx_cuda_tests:
+    name: ONNX CUDA Tests
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+    container:
+      image: diffusers/diffusers-onnxruntime-cuda
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/ --gpus 0
+    defaults:
+      run:
+        shell: bash
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+
+      - name: Environment
+        run: |
+          python utils/print_env.py
+
+      - name: Run slow ONNXRuntime CUDA tests
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "Onnx" \
+            --make-reports=tests_onnx_cuda \
+            tests/
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/tests_onnx_cuda_stats.txt
+          cat reports/tests_onnx_cuda_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: onnx_cuda_test_reports
+          path: reports
+
+  run_torch_compile_tests:
+    name: PyTorch Compile CUDA tests
+
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+
+    container:
+      image: diffusers/diffusers-pytorch-compile-cuda
+      options: --gpus 0 --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: NVIDIA-SMI
+        run: |
+          nvidia-smi
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test,training]
+      - name: Environment
+        run: |
+          python utils/print_env.py
+      - name: Run example tests on GPU
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+          RUN_COMPILE: yes
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile -s -v -k "compile" --make-reports=tests_torch_compile_cuda tests/
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_torch_compile_cuda_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: torch_compile_test_reports
+          path: reports
+
+  run_xformers_tests:
+    name: PyTorch xformers CUDA tests
+
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+
+    container:
+      image: diffusers/diffusers-pytorch-xformers-cuda
+      options: --gpus 0 --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: NVIDIA-SMI
+        run: |
+          nvidia-smi
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test,training]
+      - name: Environment
+        run: |
+          python utils/print_env.py
+      - name: Run example tests on GPU
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+        run: |
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile -s -v -k "xformers" --make-reports=tests_torch_xformers_cuda tests/
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_torch_xformers_cuda_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: torch_xformers_test_reports
+          path: reports
+
+  run_examples_tests:
+    name: Examples PyTorch CUDA tests on Ubuntu
+
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+
+    container:
+      image: diffusers/diffusers-pytorch-cuda
+      options: --gpus 0 --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: NVIDIA-SMI
+        run: |
+          nvidia-smi
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test,training]
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run example tests on GPU
+        env:
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install timm
+          python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile -s -v --make-reports=examples_torch_cuda examples/
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: |
+          cat reports/examples_torch_cuda_stats.txt
+          cat reports/examples_torch_cuda_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: examples_test_reports
+          path: reports
diffusers/.github/workflows/push_tests_fast.yml ADDED
@@ -0,0 +1,124 @@
+name: Fast tests on main
+
+on:
+  push:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+      - "examples/**.py"
+      - "tests/**.py"
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+env:
+  DIFFUSERS_IS_CI: yes
+  HF_HOME: /mnt/cache
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  PYTEST_TIMEOUT: 600
+  RUN_SLOW: no
+
+jobs:
+  run_fast_tests:
+    strategy:
+      fail-fast: false
+      matrix:
+        config:
+          - name: Fast PyTorch CPU tests on Ubuntu
+            framework: pytorch
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_cpu
+          - name: Fast Flax CPU tests on Ubuntu
+            framework: flax
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-flax-cpu
+            report: flax_cpu
+          - name: Fast ONNXRuntime CPU tests on Ubuntu
+            framework: onnxruntime
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-onnxruntime-cpu
+            report: onnx_cpu
+          - name: PyTorch Example CPU tests on Ubuntu
+            framework: pytorch_examples
+            runner: [ self-hosted, intel-cpu, 8-cpu, ci ]
+            image: diffusers/diffusers-pytorch-cpu
+            report: torch_example_cpu
+
+    name: ${{ matrix.config.name }}
+
+    runs-on: ${{ matrix.config.runner }}
+
+    container:
+      image: ${{ matrix.config.image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Install dependencies
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+
+      - name: Environment
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python utils/print_env.py
+
+      - name: Run fast PyTorch CPU tests
+        if: ${{ matrix.config.framework == 'pytorch' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "not Flax and not Onnx" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests/
+
+      - name: Run fast Flax TPU tests
+        if: ${{ matrix.config.framework == 'flax' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "Flax" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests/
+
+      - name: Run fast ONNXRuntime CPU tests
+        if: ${{ matrix.config.framework == 'onnxruntime' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            -s -v -k "Onnx" \
+            --make-reports=tests_${{ matrix.config.report }} \
+            tests/
+
+      - name: Run example PyTorch CPU tests
+        if: ${{ matrix.config.framework == 'pytorch_examples' }}
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install peft timm
+          python -m pytest -n 4 --max-worker-restart=0 --dist=loadfile \
+            --make-reports=tests_${{ matrix.config.report }} \
+            examples
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_${{ matrix.config.report }}_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_${{ matrix.config.report }}_test_reports
+          path: reports
diffusers/.github/workflows/push_tests_mps.yml ADDED
@@ -0,0 +1,75 @@
+name: Fast mps tests on main
+
+on:
+  push:
+    branches:
+      - main
+    paths:
+      - "src/diffusers/**.py"
+      - "tests/**.py"
+
+env:
+  DIFFUSERS_IS_CI: yes
+  HF_HOME: /mnt/cache
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  PYTEST_TIMEOUT: 600
+  RUN_SLOW: no
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  run_fast_tests_apple_m1:
+    name: Fast PyTorch MPS tests on MacOS
+    runs-on: macos-13-xlarge
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Clean checkout
+        shell: arch -arch arm64 bash {0}
+        run: |
+          git clean -fxd
+
+      - name: Setup miniconda
+        uses: ./.github/actions/setup-miniconda
+        with:
+          python-version: 3.9
+
+      - name: Install dependencies
+        shell: arch -arch arm64 bash {0}
+        run: |
+          ${CONDA_RUN} python -m pip install --upgrade pip uv
+          ${CONDA_RUN} python -m uv pip install -e [quality,test]
+          ${CONDA_RUN} python -m uv pip install torch torchvision torchaudio
+          ${CONDA_RUN} python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
+          ${CONDA_RUN} python -m uv pip install transformers --upgrade
+
+      - name: Environment
+        shell: arch -arch arm64 bash {0}
+        run: |
+          ${CONDA_RUN} python utils/print_env.py
+
+      - name: Run fast PyTorch tests on M1 (MPS)
+        shell: arch -arch arm64 bash {0}
+        env:
+          HF_HOME: /System/Volumes/Data/mnt/cache
+          HF_TOKEN: ${{ secrets.HF_TOKEN }}
+        run: |
+          ${CONDA_RUN} python -m pytest -n 0 -s -v --make-reports=tests_torch_mps tests/
+
+      - name: Failure short reports
+        if: ${{ failure() }}
+        run: cat reports/tests_torch_mps_failures_short.txt
+
+      - name: Test suite reports artifacts
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: pr_torch_mps_test_reports
+          path: reports
diffusers/.github/workflows/pypi_publish.yaml ADDED
@@ -0,0 +1,81 @@
+# Adapted from https://blog.deepjyoti30.dev/pypi-release-github-action
+
+name: PyPI release
+
+on:
+  workflow_dispatch:
+  push:
+    tags:
+      - "*"
+
+jobs:
+  find-and-checkout-latest-branch:
+    runs-on: ubuntu-latest
+    outputs:
+      latest_branch: ${{ steps.set_latest_branch.outputs.latest_branch }}
+    steps:
+      - name: Checkout Repo
+        uses: actions/checkout@v3
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.8'
+
+      - name: Fetch latest branch
+        id: fetch_latest_branch
+        run: |
+          pip install -U requests packaging
+          LATEST_BRANCH=$(python utils/fetch_latest_release_branch.py)
+          echo "Latest branch: $LATEST_BRANCH"
+          echo "latest_branch=$LATEST_BRANCH" >> $GITHUB_ENV
+
+      - name: Set latest branch output
+        id: set_latest_branch
+        run: echo "::set-output name=latest_branch::${{ env.latest_branch }}"
+
+  release:
+    needs: find-and-checkout-latest-branch
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout Repo
+        uses: actions/checkout@v3
+        with:
+          ref: ${{ needs.find-and-checkout-latest-branch.outputs.latest_branch }}
+
+      - name: Setup Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.8"
+
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install -U setuptools wheel twine
+          pip install -U torch --index-url https://download.pytorch.org/whl/cpu
+          pip install -U transformers
+
+      - name: Build the dist files
+        run: python setup.py bdist_wheel && python setup.py sdist
+
+      - name: Publish to the test PyPI
+        env:
+          TWINE_USERNAME: ${{ secrets.TEST_PYPI_USERNAME }}
+          TWINE_PASSWORD: ${{ secrets.TEST_PYPI_PASSWORD }}
+        run: twine upload dist/* -r pypitest --repository-url=https://test.pypi.org/legacy/
+
+      - name: Test installing diffusers and importing
+        run: |
+          pip install diffusers && pip uninstall diffusers -y
+          pip install -i https://testpypi.python.org/pypi diffusers
+          python -c "from diffusers import __version__; print(__version__)"
+          python -c "from diffusers import DiffusionPipeline; pipe = DiffusionPipeline.from_pretrained('fusing/unet-ldm-dummy-update'); pipe()"
+          python -c "from diffusers import DiffusionPipeline; pipe = DiffusionPipeline.from_pretrained('hf-internal-testing/tiny-stable-diffusion-pipe', safety_checker=None); pipe('ah suh du')"
+          python -c "from diffusers import *"
+
+      - name: Publish to PyPI
+        env:
+          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
+          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
+        run: twine upload dist/* -r pypi
diffusers/.github/workflows/run_tests_from_a_pr.yml ADDED
@@ -0,0 +1,73 @@
+name: Check running SLOW tests from a PR (only GPU)
+
+on:
+  workflow_dispatch:
+    inputs:
+      docker_image:
+        default: 'diffusers/diffusers-pytorch-cuda'
+        description: 'Name of the Docker image'
+        required: true
+      branch:
+        description: 'PR Branch to test on'
+        required: true
+      test:
+        description: 'Tests to run (e.g.: `tests/models`).'
+        required: true
+
+env:
+  DIFFUSERS_IS_CI: yes
+  IS_GITHUB_CI: "1"
+  HF_HOME: /mnt/cache
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  PYTEST_TIMEOUT: 600
+  RUN_SLOW: yes
+
+jobs:
+  run_tests:
+    name: "Run a test on our runner from a PR"
+    runs-on: [single-gpu, nvidia-gpu, t4, ci]
+    container:
+      image: ${{ github.event.inputs.docker_image }}
+      options: --gpus 0 --privileged --ipc host -v /mnt/cache/.cache/huggingface:/mnt/cache/
+
+    steps:
+      - name: Validate test files input
+        id: validate_test_files
+        env:
+          PY_TEST: ${{ github.event.inputs.test }}
+        run: |
+          if [[ ! "$PY_TEST" =~ ^tests/ ]]; then
+            echo "Error: The input string must start with 'tests/'."
+            exit 1
+          fi
+
+          if [[ ! "$PY_TEST" =~ ^tests/(models|pipelines) ]]; then
+            echo "Error: The input string must contain either 'models' or 'pipelines' after 'tests/'."
+            exit 1
+          fi
+
+          if [[ "$PY_TEST" == *";"* ]]; then
+            echo "Error: The input string must not contain ';'."
+            exit 1
+          fi
+          echo "$PY_TEST"
+
+      - name: Checkout PR branch
+        uses: actions/checkout@v4
+        with:
+          ref: ${{ github.event.inputs.branch }}
+          repository: ${{ github.event.pull_request.head.repo.full_name }}
+
+
+      - name: Install pytest
+        run: |
+          python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
+          python -m uv pip install -e [quality,test]
+          python -m uv pip install peft
+
+      - name: Run tests
+        env:
+          PY_TEST: ${{ github.event.inputs.test }}
+        run: |
+          pytest "$PY_TEST"
diffusers/.github/workflows/ssh-pr-runner.yml ADDED
@@ -0,0 +1,39 @@
+name: SSH into PR runners
+
+on:
+  workflow_dispatch:
+    inputs:
+      docker_image:
+        description: 'Name of the Docker image'
+        required: true
+
+env:
+  IS_GITHUB_CI: "1"
+  HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+  HF_HOME: /mnt/cache
+  DIFFUSERS_IS_CI: yes
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  RUN_SLOW: yes
+
+jobs:
+  ssh_runner:
+    name: "SSH"
+    runs-on: [self-hosted, intel-cpu, 32-cpu, 256-ram, ci]
+    container:
+      image: ${{ github.event.inputs.docker_image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --privileged
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: Tailscale # In order to be able to SSH when a test fails
+        uses: huggingface/tailscale-action@main
+        with:
+          authkey: ${{ secrets.TAILSCALE_SSH_AUTHKEY }}
+          slackChannel: ${{ secrets.SLACK_CIFEEDBACK_CHANNEL }}
+          slackToken: ${{ secrets.SLACK_CIFEEDBACK_BOT_TOKEN }}
+          waitForSSH: true
diffusers/.github/workflows/ssh-runner.yml ADDED
@@ -0,0 +1,46 @@
+name: SSH into GPU runners
+
+on:
+  workflow_dispatch:
+    inputs:
+      runner_type:
+        description: 'Type of runner to test (a10 or t4)'
+        required: true
+      docker_image:
+        description: 'Name of the Docker image'
+        required: true
+
+env:
+  IS_GITHUB_CI: "1"
+  HF_HUB_READ_TOKEN: ${{ secrets.HF_HUB_READ_TOKEN }}
+  HF_HOME: /mnt/cache
+  DIFFUSERS_IS_CI: yes
+  OMP_NUM_THREADS: 8
+  MKL_NUM_THREADS: 8
+  RUN_SLOW: yes
+
+jobs:
+  ssh_runner:
+    name: "SSH"
+    runs-on: [single-gpu, nvidia-gpu, "${{ github.event.inputs.runner_type }}", ci]
+    container:
+      image: ${{ github.event.inputs.docker_image }}
+      options: --shm-size "16gb" --ipc host -v /mnt/cache/.cache/huggingface/diffusers:/mnt/cache/ --gpus 0 --privileged
+
+    steps:
+      - name: Checkout diffusers
+        uses: actions/checkout@v3
+        with:
+          fetch-depth: 2
+
+      - name: NVIDIA-SMI
+        run: |
+          nvidia-smi
+
+      - name: Tailscale # In order to be able to SSH when a test fails
+        uses: huggingface/tailscale-action@main
+        with:
+          authkey: ${{ secrets.TAILSCALE_SSH_AUTHKEY }}
+          slackChannel: ${{ secrets.SLACK_CIFEEDBACK_CHANNEL }}
+          slackToken: ${{ secrets.SLACK_CIFEEDBACK_BOT_TOKEN }}
+          waitForSSH: true
diffusers/.github/workflows/stale.yml ADDED
@@ -0,0 +1,27 @@
+name: Stale Bot
+
+on:
+  schedule:
+    - cron: "0 15 * * *"
+
+jobs:
+  close_stale_issues:
+    name: Close Stale Issues
+    if: github.repository == 'huggingface/diffusers'
+    runs-on: ubuntu-latest
+    env:
+      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+    steps:
+      - uses: actions/checkout@v2
+
+      - name: Setup Python
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.8
+
+      - name: Install requirements
+        run: |
+          pip install PyGithub
+      - name: Close stale issues
+        run: |
+          python utils/stale.py
diffusers/.github/workflows/trufflehog.yml ADDED
@@ -0,0 +1,15 @@
+on:
+  push:
+
+name: Secret Leaks
+
+jobs:
+  trufflehog:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+      - name: Secret Scanning
+        uses: trufflesecurity/trufflehog@main
diffusers/.github/workflows/typos.yml ADDED
@@ -0,0 +1,14 @@
+name: Check typos
+
+on:
+  workflow_dispatch:
+
+jobs:
+  build:
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/checkout@v3
+
+      - name: typos-action
+        uses: crate-ci/[email protected]
diffusers/.github/workflows/update_metadata.yml ADDED
@@ -0,0 +1,30 @@
+name: Update Diffusers metadata
+
+on:
+  workflow_dispatch:
+  push:
+    branches:
+      - main
+      - update_diffusers_metadata*
+
+jobs:
+  update_metadata:
+    runs-on: ubuntu-22.04
+    defaults:
+      run:
+        shell: bash -l {0}
+
+    steps:
+      - uses: actions/checkout@v3
+
+      - name: Setup environment
+        run: |
+          pip install --upgrade pip
+          pip install datasets pandas
+          pip install .[torch]
+
+      - name: Update metadata
+        env:
+          HF_TOKEN: ${{ secrets.SAYAK_HF_TOKEN }}
+        run: |
+          python utils/update_metadata.py --commit_sha ${{ github.sha }}
diffusers/.github/workflows/upload_pr_documentation.yml ADDED
@@ -0,0 +1,16 @@
+name: Upload PR Documentation
+
+on:
+  workflow_run:
+    workflows: ["Build PR Documentation"]
+    types:
+      - completed
+
+jobs:
+  build:
+    uses: huggingface/doc-builder/.github/workflows/upload_pr_documentation.yml@main
+    with:
+      package_name: diffusers
+    secrets:
+      hf_token: ${{ secrets.HF_DOC_BUILD_PUSH }}
+      comment_bot_token: ${{ secrets.COMMENT_BOT_TOKEN }}
diffusers/.gitignore ADDED
@@ -0,0 +1,178 @@
+ # Initially taken from GitHub's Python gitignore file
+
+ # Byte-compiled / optimized / DLL files
+ __pycache__/
+ *.py[cod]
+ *$py.class
+
+ # C extensions
+ *.so
+
+ # tests and logs
+ tests/fixtures/cached_*_text.txt
+ logs/
+ lightning_logs/
+ lang_code_data/
+
+ # Distribution / packaging
+ .Python
+ build/
+ develop-eggs/
+ dist/
+ downloads/
+ eggs/
+ .eggs/
+ lib/
+ lib64/
+ parts/
+ sdist/
+ var/
+ wheels/
+ *.egg-info/
+ .installed.cfg
+ *.egg
+ MANIFEST
+
+ # PyInstaller
+ # Usually these files are written by a Python script from a template
+ # before PyInstaller builds the exe, so as to inject date/other infos into it.
+ *.manifest
+ *.spec
+
+ # Installer logs
+ pip-log.txt
+ pip-delete-this-directory.txt
+
+ # Unit test / coverage reports
+ htmlcov/
+ .tox/
+ .nox/
+ .coverage
+ .coverage.*
+ .cache
+ nosetests.xml
+ coverage.xml
+ *.cover
+ .hypothesis/
+ .pytest_cache/
+
+ # Translations
+ *.mo
+ *.pot
+
+ # Django stuff:
+ *.log
+ local_settings.py
+ db.sqlite3
+
+ # Flask stuff:
+ instance/
+ .webassets-cache
+
+ # Scrapy stuff:
+ .scrapy
+
+ # Sphinx documentation
+ docs/_build/
+
+ # PyBuilder
+ target/
+
+ # Jupyter Notebook
+ .ipynb_checkpoints
+
+ # IPython
+ profile_default/
+ ipython_config.py
+
+ # pyenv
+ .python-version
+
+ # celery beat schedule file
+ celerybeat-schedule
+
+ # SageMath parsed files
+ *.sage.py
+
+ # Environments
+ .env
+ .venv
+ env/
+ venv/
+ ENV/
+ env.bak/
+ venv.bak/
+
+ # Spyder project settings
+ .spyderproject
+ .spyproject
+
+ # Rope project settings
+ .ropeproject
+
+ # mkdocs documentation
+ /site
+
+ # mypy
+ .mypy_cache/
+ .dmypy.json
+ dmypy.json
+
+ # Pyre type checker
+ .pyre/
+
+ # vscode
+ .vs
+ .vscode
+
+ # Pycharm
+ .idea
+
+ # TF code
+ tensorflow_code
+
+ # Models
+ proc_data
+
+ # examples
+ runs
+ /runs_old
+ /wandb
+ /examples/runs
+ /examples/**/*.args
+ /examples/rag/sweep
+
+ # data
+ /data
+ serialization_dir
+
+ # emacs
+ *.*~
+ debug.env
+
+ # vim
+ .*.swp
+
+ # ctags
+ tags
+
+ # pre-commit
+ .pre-commit*
+
+ # .lock
+ *.lock
+
+ # DS_Store (MacOS)
+ .DS_Store
+
+ # RL pipelines may produce mp4 outputs
+ *.mp4
+
+ # dependencies
+ /transformers
+
+ # ruff
+ .ruff_cache
+
+ # wandb
+ wandb