Commit d673c67 · 1 Parent(s): 110cc77
xueyao committed

Initial commit

.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+*.png filter=lfs diff=lfs merge=lfs -text
+*.jpg filter=lfs diff=lfs merge=lfs -text
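These two patterns route the newly added PNG/JPG assets through Git LFS; they match what `git lfs track "*.png" "*.jpg"` would append to `.gitattributes`.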
README.md ADDED
@@ -0,0 +1,77 @@
---
license: other
license_name: stabilityai-ai-community
license_link: LICENSE.md
language:
- en
base_model:
- stabilityai/stable-diffusion-3.5-medium
pipeline_tag: text-to-image
---

<div align="center">

**Bokeh_Pose_Controlnet**

<img src="show.jpg"/>
</div>

## Description

- Input image: a pose map produced by OpenPose/DWPose detection
- Output image: a generated image that follows the pose control

OpenPose conditioning gives precise control over human body structure. In our tests it shows better control and generalization than the SD 1.5 ControlNet, and it adapts easily to LoRA models. A minimal sketch for producing a pose map is shown below.

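One way to produce the pose map is the third-party `controlnet_aux` package (an assumption on our part; any OpenPose/DWPose implementation that outputs a skeleton image works). A minimal sketch, with a hypothetical input path:

```python
# Sketch only: pip install controlnet_aux
from controlnet_aux import OpenposeDetector
from diffusers.utils import load_image

# "lllyasviel/Annotators" is the commonly used annotator checkpoint bundle
detector = OpenposeDetector.from_pretrained("lllyasviel/Annotators")

photo = load_image("person.jpg")   # hypothetical input photo containing a person
pose_map = detector(photo)         # PIL image containing only the OpenPose skeleton
pose_map.save("pose.png")          # use as control_image below / as the LoadImage input in ComfyUI
```
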
## Example
| Input | Output | Prompt |
|:---:|:---:|:---|
| <img src="./images/001_pose.png" width="300"/> | <img src="./images/001.png" width="300"/> | 1 girl , thinking |
| <img src="./images/002_pose.png" width="300"/> | <img src="./images/002.png" width="300"/> | a man |
| <img src="./images/003_pose.png" width="300"/> | <img src="./images/003.png" width="300"/> | a young woman in room,wear a brown short,golden short hair |
| <img src="./images/004_pose.png" width="300"/> | <img src="./images/004.png" width="300"/> | a man in room,wear a brown vintage shirt,1990s |

## Use
We recommend ComfyUI for local inference; the graph shown below is included in this repository as `bokeh_cn_workflow.json`.
![input](./comfy.png)

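For reference, the bundled workflow loads the `bokeh_medium_t5xxlfp8.safetensors` checkpoint with `bokeh_pose_cn.safetensors` as the ControlNet, and samples a 1280×1664 latent for 28 steps with `dpmpp_2m`/`sgm_uniform` at CFG 4.5 and a ControlNet strength of 0.7.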
## With Bokeh (diffusers)
```python
import torch
from diffusers import SD3ControlNetModel, StableDiffusion3ControlNetPipeline
from diffusers.utils import load_image

# Load the pose ControlNet and the Bokeh SD3.5-medium base pipeline in fp16
controlnet = SD3ControlNetModel.from_pretrained(
    "tensorart/Bokeh_Openpose_Controlnet", torch_dtype=torch.float16
)
pipe = StableDiffusion3ControlNetPipeline.from_pretrained(
    "tensorart/bokeh_3.5_medium",
    controlnet=controlnet,
    torch_dtype=torch.float16,
)
pipe.to("cuda")

# Pose map produced by OpenPose/DWPose detection
control_image = load_image(
    "https://huggingface.co/tensorart/Bokeh_Pose_Control/resolve/main/images/001_pose.png"
)
prompt = "A cat looks up, close-up, sapphire eyes"
negative_prompt = "anime,render,cartoon,3d"

image = pipe(
    prompt,
    negative_prompt=negative_prompt,
    control_image=control_image,
    num_inference_steps=30,
    height=1728,
    width=1152,
    guidance_scale=4,
    controlnet_conditioning_scale=0.8,
).images[0]
image.save("image.jpg")
```
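If the fp16 pipeline does not fit in GPU memory, replacing `pipe.to("cuda")` with the standard diffusers option `pipe.enable_model_cpu_offload()` (which requires the `accelerate` package) is a common workaround.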

## Contact
* Website: https://tensor.art / https://tusiart.com
* Developed by: TensorArt
* API: https://tams.tensor.art/
bokeh_cn_workflow.json ADDED
@@ -0,0 +1,869 @@
{
  "last_node_id": 93,
  "last_link_id": 370,
  "nodes": [
    {
      "id": 17, "type": "PrimitiveNode",
      "pos": {"0": 40, "1": -202}, "size": {"0": 298.4656982421875, "1": 99.21678161621094},
      "flags": {"collapsed": false}, "order": 0, "mode": 0,
      "inputs": [],
      "outputs": [{"name": "INT", "type": "INT", "links": [288], "slot_index": 0, "widget": {"name": "seed"}}],
      "title": "seed",
      "properties": {"Run widget replace on values": false},
      "widgets_values": [349829797374618, "increment"]
    },
    {
      "id": 30, "type": "CLIPTextEncodeSD3",
      "pos": {"0": 462, "1": -460}, "size": {"0": 346.1516418457031, "1": 140},
      "flags": {"collapsed": false}, "order": 9, "mode": 0,
      "inputs": [
        {"name": "clip", "type": "CLIP", "link": 370},
        {"name": "clip_g", "type": "STRING", "link": 222, "widget": {"name": "clip_g"}},
        {"name": "clip_l", "type": "STRING", "link": 223, "widget": {"name": "clip_l"}}
      ],
      "outputs": [{"name": "CONDITIONING", "type": "CONDITIONING", "links": [347], "slot_index": 0}],
      "properties": {"Node name for S&R": "CLIPTextEncodeSD3"},
      "widgets_values": ["anime,render,cartoon,3d", "anime,render,cartoon,3d", "", "none"]
    },
    {
      "id": 32, "type": "CLIPTextEncodeSD3",
      "pos": {"0": 462, "1": -626}, "size": {"0": 344.10260009765625, "1": 118},
      "flags": {"collapsed": false}, "order": 8, "mode": 0,
      "inputs": [
        {"name": "clip", "type": "CLIP", "link": 369},
        {"name": "clip_g", "type": "STRING", "link": 122, "widget": {"name": "clip_g"}},
        {"name": "clip_l", "type": "STRING", "link": 123, "widget": {"name": "clip_l"}},
        {"name": "t5xxl", "type": "STRING", "link": 152, "widget": {"name": "t5xxl"}}
      ],
      "outputs": [{"name": "CONDITIONING", "type": "CONDITIONING", "links": [346], "slot_index": 0}],
      "properties": {"Node name for S&R": "CLIPTextEncodeSD3"},
      "widgets_values": ["a beauty woman in room,1990s style,hot,vintage", "a beauty woman in room,1990s style,hot,vintage", "a beauty woman in room,1990s style,hot,vintage", "empty_prompt"]
    },
    {
      "id": 13, "type": "CheckpointLoaderSimple",
      "pos": {"0": -402, "1": -686}, "size": {"0": 746.5433349609375, "1": 98},
      "flags": {}, "order": 1, "mode": 0,
      "inputs": [],
      "outputs": [
        {"name": "MODEL", "type": "MODEL", "links": [368], "slot_index": 0},
        {"name": "CLIP", "type": "CLIP", "links": [369, 370], "slot_index": 1},
        {"name": "VAE", "type": "VAE", "links": [292, 351], "slot_index": 2}
      ],
      "properties": {"Node name for S&R": "CheckpointLoaderSimple"},
      "widgets_values": ["bokeh_medium_t5xxlfp8.safetensors"]
    },
    {
      "id": 72, "type": "PreviewImage",
      "pos": {"0": 1956, "1": -611}, "size": {"0": 1634.2325439453125, "1": 1421.76123046875},
      "flags": {}, "order": 14, "mode": 0,
      "inputs": [{"name": "images", "type": "IMAGE", "link": 293}],
      "outputs": [],
      "properties": {"Node name for S&R": "PreviewImage"},
      "widgets_values": []
    },
    {
      "id": 89, "type": "PreviewImage",
      "pos": {"0": 473, "1": 13}, "size": {"0": 1328.8507080078125, "1": 952.4451904296875},
      "flags": {}, "order": 10, "mode": 0,
      "inputs": [{"name": "images", "type": "IMAGE", "link": 355}],
      "outputs": [],
      "properties": {"Node name for S&R": "PreviewImage"},
      "widgets_values": []
    },
    {
      "id": 85, "type": "ControlNetApplyAdvanced",
      "pos": {"0": 949, "1": -574}, "size": {"0": 268.9471435546875, "1": 186},
      "flags": {}, "order": 11, "mode": 0,
      "inputs": [
        {"name": "positive", "type": "CONDITIONING", "link": 346},
        {"name": "negative", "type": "CONDITIONING", "link": 347},
        {"name": "control_net", "type": "CONTROL_NET", "link": 345},
        {"name": "image", "type": "IMAGE", "link": 354},
        {"name": "vae", "type": "VAE", "link": 351, "shape": 7}
      ],
      "outputs": [
        {"name": "positive", "type": "CONDITIONING", "links": [348], "slot_index": 0},
        {"name": "negative", "type": "CONDITIONING", "links": [349], "slot_index": 1}
      ],
      "properties": {"Node name for S&R": "ControlNetApplyAdvanced"},
      "widgets_values": [0.7000000000000001, 0, 1]
    },
    {
      "id": 86, "type": "LoadImage",
      "pos": {"0": -409, "1": -537}, "size": {"0": 413.1280822753906, "1": 623.165771484375},
      "flags": {"collapsed": false}, "order": 2, "mode": 0,
      "inputs": [],
      "outputs": [
        {"name": "IMAGE", "type": "IMAGE", "links": [366], "slot_index": 0},
        {"name": "MASK", "type": "MASK", "links": null}
      ],
      "properties": {"Node name for S&R": "LoadImage"},
      "widgets_values": ["ComfyUI_temp_pkzyz_00023_.png", "image"]
    },
    {
      "id": 20, "type": "PrimitiveNode",
      "pos": {"0": 32, "1": -543}, "size": {"0": 314.90277099609375, "1": 150.342529296875},
      "flags": {"collapsed": false}, "order": 3, "mode": 0,
      "inputs": [],
      "outputs": [{"name": "STRING", "type": "STRING", "links": [122, 123, 152], "slot_index": 0, "widget": {"name": "clip_g"}}],
      "title": "Positive_prompt",
      "properties": {"Run widget replace on values": false},
      "widgets_values": ["a beauty woman in room,1990s style,hot,vintage"]
    },
    {
      "id": 88, "type": "ImageScale",
      "pos": {"0": 24, "1": -49}, "size": {"0": 315, "1": 130},
      "flags": {}, "order": 7, "mode": 0,
      "inputs": [{"name": "image", "type": "IMAGE", "link": 366}],
      "outputs": [{"name": "IMAGE", "type": "IMAGE", "links": [354, 355], "slot_index": 0}],
      "properties": {"Node name for S&R": "ImageScale"},
      "widgets_values": ["nearest-exact", 1280, 1664, "disabled"]
    },
    {
      "id": 87, "type": "EmptyLatentImage",
      "pos": {"0": 464, "1": -140}, "size": {"0": 352.0178527832031, "1": 106},
      "flags": {}, "order": 4, "mode": 0,
      "inputs": [],
      "outputs": [{"name": "LATENT", "type": "LATENT", "links": [367], "slot_index": 0}],
      "properties": {"Node name for S&R": "EmptyLatentImage"},
      "widgets_values": [1280, 1664, 1]
    },
    {
      "id": 71, "type": "VAEDecode",
      "pos": {"0": 1500, "1": -377}, "size": {"0": 210, "1": 46},
      "flags": {}, "order": 13, "mode": 0,
      "inputs": [
        {"name": "samples", "type": "LATENT", "link": 291},
        {"name": "vae", "type": "VAE", "link": 292}
      ],
      "outputs": [{"name": "IMAGE", "type": "IMAGE", "links": [293], "slot_index": 0}],
      "properties": {"Node name for S&R": "VAEDecode"},
      "widgets_values": []
    },
    {
      "id": 52, "type": "PrimitiveNode",
      "pos": {"0": 33, "1": -353}, "size": {"0": 309.40350341796875, "1": 96.14453887939453},
      "flags": {"collapsed": false}, "order": 5, "mode": 0,
      "inputs": [],
      "outputs": [{"name": "STRING", "type": "STRING", "links": [222, 223], "slot_index": 0, "widget": {"name": "clip_g"}}],
      "title": "Negative_prompt",
      "properties": {"Run widget replace on values": false},
      "widgets_values": ["anime,render,cartoon,3d"]
    },
    {
      "id": 84, "type": "ControlNetLoader",
      "pos": {"0": 466, "1": -268}, "size": [345.26156928165994, 66.37418251272618],
      "flags": {}, "order": 6, "mode": 0,
      "inputs": [],
      "outputs": [{"name": "CONTROL_NET", "type": "CONTROL_NET", "links": [345], "slot_index": 0}],
      "properties": {"Node name for S&R": "ControlNetLoader"},
      "widgets_values": ["bokeh_pose_cn.safetensors"]
    },
    {
      "id": 70, "type": "KSampler",
      "pos": {"0": 951, "1": -341}, "size": [266.95740211981797, 234],
      "flags": {}, "order": 12, "mode": 0,
      "inputs": [
        {"name": "model", "type": "MODEL", "link": 368},
        {"name": "positive", "type": "CONDITIONING", "link": 348},
        {"name": "negative", "type": "CONDITIONING", "link": 349},
        {"name": "latent_image", "type": "LATENT", "link": 367},
        {"name": "seed", "type": "INT", "link": 288, "widget": {"name": "seed"}}
      ],
      "outputs": [{"name": "LATENT", "type": "LATENT", "links": [291], "slot_index": 0}],
      "properties": {"Node name for S&R": "KSampler"},
      "widgets_values": [349829797374618, "randomize", 28, 4.5, "dpmpp_2m", "sgm_uniform", 1]
    }
  ],
  "links": [
    [122, 20, 0, 32, 1, "STRING"],
    [123, 20, 0, 32, 2, "STRING"],
    [152, 20, 0, 32, 3, "STRING"],
    [222, 52, 0, 30, 1, "STRING"],
    [223, 52, 0, 30, 2, "STRING"],
    [288, 17, 0, 70, 4, "INT"],
    [291, 70, 0, 71, 0, "LATENT"],
    [292, 13, 2, 71, 1, "VAE"],
    [293, 71, 0, 72, 0, "IMAGE"],
    [345, 84, 0, 85, 2, "CONTROL_NET"],
    [346, 32, 0, 85, 0, "CONDITIONING"],
    [347, 30, 0, 85, 1, "CONDITIONING"],
    [348, 85, 0, 70, 1, "CONDITIONING"],
    [349, 85, 1, 70, 2, "CONDITIONING"],
    [351, 13, 2, 85, 4, "VAE"],
    [354, 88, 0, 85, 3, "IMAGE"],
    [355, 88, 0, 89, 0, "IMAGE"],
    [366, 86, 0, 88, 0, "IMAGE"],
    [367, 87, 0, 70, 3, "LATENT"],
    [368, 13, 0, 70, 0, "MODEL"],
    [369, 13, 1, 32, 0, "CLIP"],
    [370, 13, 1, 30, 0, "CLIP"]
  ],
  "groups": [],
  "config": {},
  "extra": {
    "ds": {
      "scale": 0.520986848192522,
      "offset": [285.4133429877194, 646.3946136967473]
    }
  },
  "version": 0.4
}
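As a quick way to check which nodes and settings the workflow contains without opening ComfyUI, here is a small sketch using only the standard `json` module (the file name is the one added in this commit):

```python
import json

# Load the bundled ComfyUI workflow and list its nodes.
with open("bokeh_cn_workflow.json") as f:
    wf = json.load(f)

for node in wf["nodes"]:
    print(node["id"], node["type"], node.get("widgets_values"))

# e.g. the KSampler node (id 70) stores:
# [seed, "randomize", 28, 4.5, "dpmpp_2m", "sgm_uniform", 1]
```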
comfy.png ADDED

Git LFS Details

  • SHA256: ac0b5a1a07962872b97006046cde5c42abc6b710bb00eed810a30905a6cdf749
  • Pointer size: 132 Bytes
  • Size of remote file: 3.2 MB
config.json ADDED
@@ -0,0 +1,33 @@
{
  "_class_name": "SD3ControlNetModel",
  "_diffusers_version": "0.32.0.dev0",
  "_name_or_path": "tensorart/bokeh_3.5_medium",
  "attention_head_dim": 64,
  "caption_projection_dim": 1536,
  "dual_attention_layers": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
  "extra_conditioning_channels": 0,
  "in_channels": 16,
  "joint_attention_dim": 4096,
  "num_attention_heads": 24,
  "num_layers": 23,
  "out_channels": 16,
  "patch_size": 2,
  "pooled_projection_dim": 2048,
  "pos_embed_max_size": 384,
  "qk_norm": "rms_norm",
  "sample_size": 128
}
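This is the diffusers model config for the pose ControlNet (an SD3 ControlNet with 23 joint blocks). It is read automatically by `from_pretrained`, but it can also be inspected on its own; `load_config` comes from diffusers' ConfigMixin, and the repo id below is assumed to match the README example:

```python
from diffusers import SD3ControlNetModel

# Returns the dict stored in config.json without instantiating the weights.
cfg = SD3ControlNetModel.load_config("tensorart/Bokeh_Openpose_Controlnet")
print(cfg["num_layers"], cfg["joint_attention_dim"], cfg["pos_embed_max_size"])
```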
images/001.png ADDED

Git LFS Details

  • SHA256: 0587157aee1b7205c3ec6c4809bd637072922e18a785545786737023e31d960b
  • Pointer size: 132 Bytes
  • Size of remote file: 2.15 MB
images/001.webp ADDED
images/001_pose.png ADDED

Git LFS Details

  • SHA256: 921ed9012ba4f8cd3ac5a372439b9501328bc805bdc7efb746cecdf613f95708
  • Pointer size: 130 Bytes
  • Size of remote file: 50.6 kB
images/002.png ADDED

Git LFS Details

  • SHA256: 428a1bf781f0ddd0dd0f5b3708131123d820636111b0110c2a4d029e108f6a38
  • Pointer size: 132 Bytes
  • Size of remote file: 2.12 MB
images/002_pose.png ADDED

Git LFS Details

  • SHA256: cd76377445ddf8c2afee82603b82dfa91650272cbe67853f4f8cdea603dc06cd
  • Pointer size: 131 Bytes
  • Size of remote file: 107 kB
images/003.png ADDED

Git LFS Details

  • SHA256: 5293e7ddbec4e1b76f8b58242df9cefd9e28bbe139498f586be29e25aebfc770
  • Pointer size: 132 Bytes
  • Size of remote file: 2.63 MB
images/003_pose.png ADDED

Git LFS Details

  • SHA256: b16a83ca82b9b2f47091ec5317ad2e9f8f4d7dbe035158795adad202a12e33ab
  • Pointer size: 130 Bytes
  • Size of remote file: 79.2 kB
images/004.png ADDED

Git LFS Details

  • SHA256: ea1e2c9f3e77a7ff8a37c360d597ecc8bb8f29ff8305cc054848b24a77563eef
  • Pointer size: 132 Bytes
  • Size of remote file: 2.62 MB
images/004_pose.png ADDED

Git LFS Details

  • SHA256: 318c51ac29ad5c518c33394872794b02fbebde7fd2c97c636647841224050f6a
  • Pointer size: 130 Bytes
  • Size of remote file: 27.5 kB
show.jpg ADDED

Git LFS Details

  • SHA256: 4a5952277982eee417550000503416adfeec3f3aba1e2f9f7d9d51ee94fa0b6d
  • Pointer size: 132 Bytes
  • Size of remote file: 1.16 MB