jon-almazan committed
Commit 1b8e70f · verified · 1 Parent(s): 660e339

Upload 3 files

Files changed (3)
  1. LICENSE +202 -0
  2. NOTICE +7 -0
  3. README.md +115 -0
LICENSE ADDED
@@ -0,0 +1,202 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
NOTICE ADDED
@@ -0,0 +1,7 @@
Photoroom PRX
Copyright 2025 Photoroom Inc.

This release includes the model weights of Google's T5 Gemma 2B-2B text encoder, available at https://huggingface.co/google/t5gemma-2b-2b-ul2, as well as the model weights of the Flux VAE part of the Flux.1 [schnell] model, available at https://huggingface.co/black-forest-labs/FLUX.1-schnell.

Gemma is provided under and subject to the Gemma Terms of Use found at ai.google.dev/gemma/terms.
Flux.1 [schnell] is licensed under the Apache License 2.0.
README.md ADDED
@@ -0,0 +1,115 @@
---
pipeline_tag: text-to-image
library_name: diffusers
license: apache-2.0
tags:
- diffusion
- text-to-image
- photoroom
- prx
- open-source
- image-generation
- flow-matching
demo: https://huggingface.co/spaces/Photoroom/PRX-1024-beta-version
model_type: diffusion-transformer
inference: true
---

# PRX: Open Text-to-Image Generative Model

![PRX](https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/ZiHS6kQv64EArhBcv7_Yk.jpeg)

**PRX (Photoroom Experimental)** is a **1.3-billion-parameter text-to-image model trained entirely from scratch** and released under an **Apache 2.0 license**.

It is part of Photoroom’s broader effort to **open-source the complete process** behind training large-scale text-to-image models, covering architecture design, optimization strategies, and post-training alignment. The goal is to make PRX both a **strong open baseline** and a **transparent research reference** for those developing or studying diffusion-transformer models.

For more information, please read our [announcement blog post](https://huggingface.co/blog/Photoroom/prx-open-source-t2i-model).

## Model description

PRX is designed to be **lightweight yet capable**, easy to fine-tune or extend, and fully open.

PRX generates high-quality images from text using a simplified MMDiT architecture in which the text tokens are not updated through the transformer blocks. It uses flow matching with a discrete schedule for efficient sampling, and Google’s T5-Gemma-2B-2B-UL2 model for multilingual text encoding. The model has around **1.3B parameters** and delivers fast inference without sacrificing quality. You can choose between the **Flux VAE** for balanced quality and speed, or **DC-AE** for higher latent compression and faster processing.
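
To make the flow-matching sampling loop concrete, below is a minimal Euler-integration sketch of the general technique. The `model(x, t, text_emb)` callable, its signature, and the time convention are illustrative assumptions, not the actual PRX API; for real inference, use the pipeline shown under "Example usage".

```python
import torch

@torch.no_grad()
def euler_flow_sampler(model, text_emb, shape, num_steps=28, device="cuda"):
    """Generic Euler integration of a learned flow-matching velocity field.

    Convention (an assumption): t=1 is pure noise, t=0 is data, and the
    model predicts the velocity v = x_noise - x_data, so stepping with a
    negative dt moves latents toward the data distribution.
    """
    x = torch.randn(shape, device=device)                 # start at t=1 (noise)
    ts = torch.linspace(1.0, 0.0, num_steps + 1, device=device)  # discrete schedule
    for i in range(num_steps):
        dt = ts[i + 1] - ts[i]                            # negative step size
        v = model(x, ts[i].expand(shape[0]), text_emb)    # predicted velocity
        x = x + dt * v                                    # Euler update
    return x  # latents at t=0; decode with the VAE (Flux VAE or DC-AE)
```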

This card in particular describes `Photoroom/prx-256-t2i-sft`, one of the PRX model variants:

- **Resolution:** 256 pixels
- **Architecture:** PRX (an MMDiT-like diffusion-transformer variant)
- **Latent backbone:** Flux's VAE
- **Text encoder:** T5-Gemma-2B-2B-UL2
- **Training stage:** Supervised fine-tuning (SFT)
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)

For other checkpoints, browse the full [PRX collection](https://huggingface.co/collections/Photoroom/prx).

## Example usage

You can use PRX directly in [Diffusers](https://huggingface.co/docs/diffusers/main/en/api/pipelines/prx):

```python
import torch
from diffusers.pipelines.prx import PRXPipeline

pipe = PRXPipeline.from_pretrained(
    "Photoroom/prx-256-t2i-sft",
    torch_dtype=torch.bfloat16,
).to("cuda")

prompt = "A front-facing portrait of a lion in the golden savanna at sunset"
image = pipe(prompt, num_inference_steps=28, guidance_scale=5.0).images[0]
image.save("lion.png")
```

## Visual examples and demo

Here are some examples from one of our best checkpoints so far ([Photoroom/prx-1024-t2i-beta](https://huggingface.co/Photoroom/prx-1024-t2i-beta)).

<div style="display:flex; justify-content:center; width:100%;">
  <table style="border-collapse:collapse; width:100%; max-width:900px; table-layout:fixed;">
    <tr>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/ljWO7dK-_CKypruXcApeN.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/IDHiXpRlUISeJxXtJM6fW.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/HemYHcMexWnAuYYor5Ztx.webp" style="width:100%; height:auto;"/></td>
    </tr>
    <tr>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/kEUd7dO_30ngn__scTH3M.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/jGkseXch9HWfB48Z-k5OX.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/5YnGFBiM1IHrzLh2h7q7t.webp" style="width:100%; height:auto;"/></td>
    </tr>
    <tr>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/OrMntTSvpE8GH1YrBNgZD.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/Aglz2CljITrEY4V-Q-P60.webp" style="width:100%; height:auto;"/></td>
      <td><img src="https://cdn-uploads.huggingface.co/production/uploads/68d136d7307413e80188d819/h47OBkGOsKVmq51KSRaRu.webp" style="width:100%; height:auto;"/></td>
    </tr>
  </table>
</div>

[PRX Demo on Hugging Face Spaces](https://huggingface.co/spaces/Photoroom/PRX-1024-beta-version): an interactive text-to-image demo for `Photoroom/prx-1024-t2i-beta`.
87
+
88
+ ## Training details
89
+
90
+ PRX models were trained from scratch using recent advances in diffusion and flow-matching training. We experimented with a range of modern techniques for efficiency, stability, and alignment, which we’ll cover in more detail in our upcoming series of research posts:
91
+
92
+ - [Part 0: Overview and release](https://huggingface.co/blog/Photoroom/prx-open-source-t2i-model)
93
+ - Part 1: Design experiments and architecture benchmark *(coming soon)*
94
+ - Part 2: Accelerating training *(coming soon)*
95
+ - Part 3: Post-pretraining *(coming soon)*
96
+
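
As background for the flow-matching objective mentioned above, here is a minimal sketch of the standard conditional flow-matching (rectified-flow) training loss. It is a generic illustration of the technique, not Photoroom's actual training code; the model signature, uniform time sampling, and unweighted MSE are all assumptions.

```python
import torch
import torch.nn.functional as F

def flow_matching_loss(model, x0, text_emb):
    """One generic conditional flow-matching training step.

    x0: clean image latents, shape (B, C, H, W).
    model(x_t, t, text_emb) predicts the velocity field; this signature
    is a stand-in for illustration, not the actual PRX forward pass.
    """
    b = x0.shape[0]
    x1 = torch.randn_like(x0)                    # noise endpoint of the path
    t = torch.rand(b, device=x0.device)          # uniform time sampling (an assumption)
    t_ = t.view(b, 1, 1, 1)
    x_t = (1 - t_) * x0 + t_ * x1                # linear interpolation between data and noise
    target_v = x1 - x0                           # constant target velocity along the path
    v_pred = model(x_t, t, text_emb)
    return F.mse_loss(v_pred, target_v)
```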

## Other PRX models

You can find additional checkpoints in the [PRX collection](https://huggingface.co/collections/Photoroom/prx):

- **Base** – the pretrained model before alignment; the best starting point for fine-tuning or research
- **SFT** – the supervised fine-tuned model; produces more aesthetically pleasing, ready-to-use generations
- **Latent backbones** – Flux's and DC-AE VAEs
- **Distilled** – 8-step generation with LADD (see the sketch after this list)
- **Resolutions** – 256, 512, and 1024 pixels
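
For example, a LADD-distilled checkpoint would be run the same way as the SFT model but with only 8 sampling steps. The repo id below is a placeholder, not a confirmed checkpoint name (check the collection for the actual ids), and the guidance settings for distilled variants are an assumption:

```python
import torch
from diffusers.pipelines.prx import PRXPipeline

# Placeholder repo id for a distilled checkpoint; see the PRX collection
# for the actual checkpoint names.
pipe = PRXPipeline.from_pretrained(
    "Photoroom/prx-512-t2i-distilled",
    torch_dtype=torch.bfloat16,
).to("cuda")

# LADD distillation targets few-step sampling, so 8 steps instead of ~28.
image = pipe("A misty pine forest at dawn", num_inference_steps=8).images[0]
image.save("forest.png")
```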

## License

PRX is available under an **Apache 2.0 license**.

## Use restrictions

You must not use PRX models for:

1. any of the restricted uses set forth in the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy); or
2. any activity that violates applicable laws or regulations.