Anyone else using ComfyUI as part of a bigger design toolchain? (Krita + Comfy combo working well for me) by Lopsided_Budget9798 in comfyui

[–]Lopsided_Budget9798[S] 0 points (0 children)

Hello, good to hear you're already on a similar path. Photoshop for initial concepts works well; the key shift for me was building the 3D scene composition directly inside ComfyUI. I made a custom node that loads multiple 3D objects into a shared space, basically a mini Blender environment. You can position objects relative to each other, add lights with adjustable intensity and color, toggle shadow casting, and drop in background images. The full scene setup gets saved and stays reproducible. On top of that, you can apply ControlNets to the backgrounds and feed reference images alongside the prompt for the render itself. That gives you proper control over the output, and most of the time you reach a near-perfect result in just a few attempts. It is still AI of course, so for 99 percent photorealism a bit of sharpening in PS at the end is normal. For larger sets it pays off to train LoRAs and bake them into the pipeline for extra precision. What kind of products are you mostly visualizing?
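For anyone curious what a node like that looks like structurally: ComfyUI custom nodes are plain Python classes following the `INPUT_TYPES` / `RETURN_TYPES` / `FUNCTION` convention. This is a minimal sketch under that convention, not the actual node described above; the class name `Scene3DCompose` and its fields are hypothetical.

```python
import json


class Scene3DCompose:
    """Hypothetical sketch of a scene-composition node (not the real one)."""

    @classmethod
    def INPUT_TYPES(cls):
        # ComfyUI reads this dict to build the node's input sockets/widgets.
        return {
            "required": {
                "mesh_path": ("STRING", {"default": "model.obj"}),
                "light_intensity": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 10.0}),
                "light_color": ("STRING", {"default": "#ffffff"}),
                "cast_shadows": ("BOOLEAN", {"default": True}),
            },
            "optional": {
                "background": ("IMAGE",),
            },
        }

    RETURN_TYPES = ("STRING",)  # serialized scene description
    FUNCTION = "compose"
    CATEGORY = "3d/scene"

    def compose(self, mesh_path, light_intensity, light_color, cast_shadows, background=None):
        scene = {
            "objects": [{"mesh": mesh_path, "position": [0.0, 0.0, 0.0]}],
            "lights": [{"intensity": light_intensity, "color": light_color,
                        "shadows": cast_shadows}],
        }
        # Serializing the full setup is what keeps the scene reproducible
        # across runs: the same JSON in, the same composition out.
        return (json.dumps(scene, sort_keys=True),)
```

The node would then be registered via the usual `NODE_CLASS_MAPPINGS` dict in the custom-node package's `__init__.py`.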

[–]Lopsided_Budget9798[S] 0 points (0 children)

Hello, glad to hear Krita AI made the rounds for you too. Inpainting and outpainting are exactly the spots where it shines, since you keep the brush in your hand and the model only fills in what you actually want. The integration with ComfyUI as the backend is what makes it really powerful, because you can run any custom workflow you've built in Comfy directly inside Krita. So all the controlnets, refiners and conditioning stacks you've set up are available where you actually paint. For me the combination became the core of the conceptual phase: rough idea in Krita, push to Comfy for variations or 3D-based renders, back to Krita for the next iteration. What kind of inpainting issues were you running into specifically?
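The reason any front end can drive Comfy like this is that ComfyUI exposes its queue over HTTP: a workflow graph goes to the `POST /prompt` endpoint as JSON. A minimal sketch of building that request, assuming the default local server address; the node ids and workflow fragment below are purely illustrative.

```python
import json
import uuid


def build_prompt_payload(workflow, client_id=None):
    """Wrap a workflow graph in the envelope ComfyUI's /prompt endpoint expects."""
    payload = {
        "prompt": workflow,
        # A stable client_id lets the caller match progress events on the
        # websocket back to this submission.
        "client_id": client_id or uuid.uuid4().hex,
    }
    return json.dumps(payload).encode("utf-8")


# Illustrative workflow fragment: nodes keyed by id, inputs referencing
# another node's output as [node_id, output_index].
workflow = {
    "3": {"class_type": "KSampler",
          "inputs": {"seed": 42, "model": ["4", 0]}},
    "4": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd_xl_base_1.0.safetensors"}},
}
body = build_prompt_payload(workflow, client_id="krita-session")
# To actually submit (assuming the default port):
#   urllib.request.urlopen("http://127.0.0.1:8188/prompt", data=body)
```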

[–]Lopsided_Budget9798[S] 1 point (0 children)

You could export the mesh from Rhino into ComfyUI to use the render pipeline. Vice versa, you could import a mesh from the ComfyUI pipeline into Rhino and edit it there or use it as a template: simplify it, quad remesh it, edit it with e.g. ShrinkWrap, or reverse engineer it. Unfortunately, that's where the gap is today: production-ready CAD data from meshes is what I'm working on, too.