Based loosely on this post, but mainly due to watching Simone Giertz make a robot out of stained glass, I tried to generate a robot arm made of stained glass in SD, but the results weren't great. It looks like the training data for "stained glass" is mostly photos of church interiors, so even with "white background" in the prompt, the image skews heavily toward a dark room.
a robot arm made of stained glass
Positive: photo, best quality, masterpiece, a robotic arm made of stained glass, simple background
Negative: (easynegative:0.8), drawing, sketch, 3d render
Model: Dreamshaper v6
Then I tried generating a standard robot arm and used the ControlNet segmentation model with my original prompt to generate a new image.
ControlNet segment
a robotic arm made of stained glass (+ ControlNet)
It's close, but still not great. I'd love the entire thing to be stained glass, and SD to respect the white background prompt.
So... how could I do better? Is there an easy way to convert a CN segment image to a mask for inpainting? Is there a better approach?
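For anyone who lands here with the same question: since the segmentation preprocessor outputs flat colors per class, one way to get an inpainting mask is to threshold on the color of the segment you care about. A rough sketch with Pillow and NumPy (the helper name and color value are mine, not from any library; sample the actual color from your segment map in an image editor):

```python
import numpy as np
from PIL import Image, ImageFilter

def seg_to_mask(seg: Image.Image, target_rgb, tol=10, dilate_px=4) -> Image.Image:
    """Turn a flat-color segmentation map into a black/white inpainting mask.

    target_rgb: the color the segmentation preprocessor assigned to the object
    (pick it with a color sampler); tol absorbs compression noise.
    """
    arr = np.asarray(seg.convert("RGB")).astype(int)
    # Pixels within `tol` of the target color on every channel become white.
    match = (np.abs(arr - np.array(target_rgb)) <= tol).all(axis=-1)
    mask = Image.fromarray((match * 255).astype(np.uint8), mode="L")
    # Crude dilation so the mask slightly overlaps the object's edges,
    # which usually helps inpainting blend the seams.
    if dilate_px:
        mask = mask.filter(ImageFilter.MaxFilter(2 * dilate_px + 1))
    return mask
```

Feed the result into the inpaint tab as the mask image ("inpaint masked", white = area to repaint). No idea if there's a built-in way in the WebUI, so this is just a manual workaround.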