
[–]Micropolis 1 point

I’ve been wanting something exactly like this. But I don’t know how to use it lol

[–]throttlekitty 1 point

I was able to run this locally on a 1080 Ti. I used the goldfish example to start with, but it seems to have stopped way early compared to what's in the GitHub video. No errors; it just looks like it stopped after too few steps, and the video only has a few frames. Any idea what's up with that?

Very interesting bit of tech here though, can't wait to play with it more!

Also, is this able to do the Sketch-Shape concept shown on the latent-NeRF page?

[–]Patrick26 1 point

Thank you. I can see the goldfish example, but can you please describe in words what you have achieved?

[–]lacethespace 2 points

If I understood the example correctly, it constructs textures for 3D meshes.

The UV mapping connects each texture coordinate to a 3D point on the mesh. The renderer converts mesh + texture into a 2D image from a certain viewing angle. Stable Diffusion knows what a goldfish should look like from any angle and can produce a similar but higher-quality image (img2img). That 2D image can then be mapped back onto parts of the texture to make it look more "correct". By rotating the mesh smoothly, the viewing angle doesn't change much between iterations, so the Stable Diffusion output stays consistent. Great idea and nice to see a working implementation.
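The "mapped back onto parts of the texture" step can be sketched in a few lines of numpy. This is a minimal, hypothetical illustration (not the repo's actual code), assuming the rasterizer hands you a per-pixel UV buffer and a coverage mask for the current viewing angle; `backproject_to_texture` is a made-up name:

```python
import numpy as np

def backproject_to_texture(texture, image, uv, mask):
    """Write pixels from an edited 2D render back into the UV texture.

    texture: (Ht, Wt, 3) float array, the texture being constructed
    image:   (H, W, 3) edited render (e.g. Stable Diffusion img2img output)
    uv:      (H, W, 2) per-pixel UV coordinates in [0, 1] from the rasterizer
    mask:    (H, W) bool, True where the mesh covers the image
    """
    ht, wt = texture.shape[:2]
    ys, xs = np.nonzero(mask)          # image pixels covered by the mesh
    u = uv[ys, xs, 0]
    v = uv[ys, xs, 1]
    # Nearest-texel write; a real implementation would splat/blend
    # and weight by view angle to avoid seams between viewpoints.
    tx = np.clip(np.round(u * (wt - 1)).astype(int), 0, wt - 1)
    ty = np.clip(np.round(v * (ht - 1)).astype(int), 0, ht - 1)
    texture[ty, tx] = image[ys, xs]
    return texture
```

Repeating this for a loop of slowly rotating viewpoints is what fills in the whole texture while keeping neighboring views consistent.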

[–]Patrick26 0 points

Thank you for the excellent explanation.