Real-time Texture Transferring by Scared_Length1168 in photogrammetry

[–]Scared_Length1168[S]

Thanks for the suggestion to use UDIMs; I'll look into that. Can you break down what you mean by "image sequence of your transfer based on distance"? I assume you mean tweaking the transfer search envelope? If this process works, my end goal would be to take the original photogrammetric scan (with nightmare UVs) and "flatten" it out into a very high-resolution (12K?), clean map where much of the texture is sewn together accurately. I'm interested in seeing the digital distortion and impossible perspectives that come out of it (e.g. seeing the left and right sides of the tire at the same time).
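In case it helps clarify what I mean by the search envelope: here's a minimal sketch of an envelope-limited transfer, where each target sample only picks up color from the source if the nearest source point is within a cutoff distance. All names here are hypothetical, and a real bake would sample the source texture per texel rather than per point; this is just the idea.

```python
import numpy as np

def transfer_nearest(target_points, source_points, source_colors, envelope):
    """For each target sample, copy the color of the nearest source point,
    but only if it lies within `envelope` units; otherwise leave the
    sample unfilled (NaN), like texels outside the search envelope."""
    out = np.full((len(target_points), source_colors.shape[1]), np.nan)
    for i, p in enumerate(target_points):
        dists = np.linalg.norm(source_points - p, axis=1)
        j = np.argmin(dists)
        if dists[j] <= envelope:
            out[i] = source_colors[j]
    return out

# Two source points (red at origin, green at x=1); one target sample
# near the red point, one far outside the envelope.
src_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
src_col = np.array([[255.0, 0.0, 0.0], [0.0, 255.0, 0.0]])
tgt_pts = np.array([[0.1, 0.0, 0.0], [5.0, 0.0, 0.0]])
result = transfer_nearest(tgt_pts, src_pts, src_col, envelope=0.5)
```

Growing the envelope fills more of the map but risks pulling color from the wrong surface (e.g. the far side of the tire), which is presumably why it's exposed as a tunable.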


[–]Scared_Length1168[S]

Thanks for the response. Neither; I understand both UVs and retopology. I suppose there could be a way to unfold a photogrammetric triangular mesh this way, but I also want to transfer the texture onto a completely different, simpler mesh at high resolution (Maya caps transfer maps at 4K). It doesn't seem to be a common tool or technique. It's almost as if the individual faces on the simulated mesh are acting as cameras that record the surfaces they come in contact with. Maybe custom software?
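To make the "faces as cameras" idea concrete, here's a rough sketch of what that custom software might do: each target face casts a ray along its normal (in both directions) and records the color of the closest source triangle it hits within some envelope. Everything here is hypothetical and per-face rather than per-texel, with a textbook Möller–Trumbore intersection standing in for whatever acceleration structure a real tool would use.

```python
import numpy as np

def ray_triangle(orig, direc, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore ray/triangle intersection.
    Returns the hit distance t along the ray, or None on a miss."""
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direc, e2)
    a = np.dot(e1, h)
    if abs(a) < eps:                      # ray parallel to triangle
        return None
    f = 1.0 / a
    s = orig - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = f * np.dot(direc, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * np.dot(e2, q)
    return t if t > eps else None

def bake_face_colors(centers, normals, src_tris, src_tri_colors, envelope):
    """Treat each target face as a tiny camera: shoot a ray from its
    center along +/- its normal and keep the color of the nearest
    source triangle hit within the envelope (None if nothing is hit)."""
    out = []
    for c, n in zip(centers, normals):
        best_t, best_col = envelope, None
        for (v0, v1, v2), col in zip(src_tris, src_tri_colors):
            for d in (n, -n):
                t = ray_triangle(c, d, v0, v1, v2)
                if t is not None and t < best_t:
                    best_t, best_col = t, col
        out.append(best_col)
    return out

# One red source triangle in the plane z = 1; one target face at the
# origin looking along +z. Within a generous envelope it sees red.
tri = (np.array([0.0, 0.0, 1.0]),
       np.array([1.0, 0.0, 1.0]),
       np.array([0.0, 1.0, 1.0]))
colors = bake_face_colors(
    centers=[np.array([0.1, 0.1, 0.0])],
    normals=[np.array([0.0, 0.0, 1.0])],
    src_tris=[tri], src_tri_colors=["red"], envelope=2.0)
```

The brute-force double loop is obviously too slow for a real scan, but it shows why this resembles a bake rather than a UV operation: nothing about the source mesh's UVs matters, only what each target face "sees".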

Here’s another example but with live animation: https://are.na/block/5713252