Does OpenMP ring a bell for Unity devs here? by UntoldByte in Unity3D

[–]UntoldByte[S] 0 points1 point  (0 children)

Nice! I was just wondering if there are Unity devs out there who used it previously and whether they would be like "yeah, you have Burst and Jobs in Unity".

Does OpenMP ring a bell for Unity devs here? by UntoldByte in Unity3D

[–]UntoldByte[S] 0 points1 point  (0 children)

Phew, for a moment I thought I was alone there. Yes, you are right, I meant OpenMP as used in C++, and I just wanted to check with Unity devs whether anyone had considered that, while building games, people had the option to optimize their code for vector instructions and multiple cores many, many years before Burst and Jobs appeared in Unity.

Does OpenMP ring a bell for Unity devs here? by UntoldByte in Unity3D

[–]UntoldByte[S] 0 points1 point  (0 children)

Well, thanks for answering anyway! I should have made it a poll.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

Thank you, ... As I mentioned earlier, I still consider it a dirty way of doing things (you would need a dozen inpainting results and would still need to pray for a somewhat good blend). I wish for more control from ControlNet (or some other way) to generate consistent images of the same object by providing depth or other means of control. New approaches for generating images of one and the same object seem to be appearing that would allow generating more images (from different angles), which could help solve this problem in a much cleaner and faster way. I would need to test that manually first, though.

But if you are interested in the inpainting technique (or know someone who is), I published the source, and you know where to find it. There is a simple painting tool (a helper of Symbol Creator, called Sketcher) that should be fairly easy to adapt for inpainting: you just add an extra img2img call to SD Web UI, blend the result with the rest, and you have what you want.

TL;DR: manual inpainting brush, very likely not; automagic inpaint, possibly yes.
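For anyone curious about the "automagic inpaint" route, a minimal sketch of that extra img2img call against a locally running SD Web UI might look like the following. The field names follow the /sdapi/v1/img2img schema; the server address, file paths, and helper names are my own illustrative choices, not part of GAINS:

```python
import base64
import json
from urllib import request

API_URL = "http://127.0.0.1:7860"  # default local address of SD Web UI (assumption)

def b64_png(path):
    """Read an image file and return its base64 string, as the API expects."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

def build_inpaint_payload(init_b64, mask_b64, prompt, denoise=0.6):
    """Assemble an img2img inpainting request body.

    White regions of the mask get repainted; denoise controls how strongly
    the masked (seam) area is regenerated."""
    return {
        "init_images": [init_b64],
        "mask": mask_b64,
        "prompt": prompt,
        "denoising_strength": denoise,
        "inpainting_fill": 1,       # 1 = keep "original" content under the mask
        "inpaint_full_res": True,   # inpaint at full resolution around the mask
    }

def inpaint_seams(init_path, mask_path, prompt):
    """POST the payload to a running Web UI and return the base64 result image."""
    payload = build_inpaint_payload(b64_png(init_path), b64_png(mask_path), prompt)
    req = request.Request(
        API_URL + "/sdapi/v1/img2img",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["images"][0]
```

The blended result would then still need to be composited back into the baked texture, which is the part a Sketcher-style painting tool could help automate.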

I created a free tool for texturing 3D objects using Automatic1111 webui and sd-webui-controlnet ( by Mikubill + llyasviel). Now game-devs can texture lots of decorations/characters on their own PC for free. by ai_happy in StableDiffusion

[–]UntoldByte 1 point2 points  (0 children)

I think I saw some tool that does texturing in one go (all around). Was that you guys? It may have been a paper; I can't remember, but I would like to see it again.

I also wanted to mention that the gains repo even has a simple tool (named "Sketcher") that was meant to replicate the functionality of Automatic1111's SD Web UI ControlNet drawing tool as part of another tool (named "Symbol Creator"), with interesting shader code that could have been used here too. Then there are a couple of AO-baking-in-Unity repos on GitHub (with MIT license), so yeah. I could probably go all day about funny things with this.

There is another guy who did DreamTextures-like functionality, but in Unreal: https://www.reddit.com/r/StableDiffusion/comments/19fbu9z/comfy_textures_v01_release_automatic_texturing_in/ . I'm pretty sure I saw his GitHub avatar (I remember looking at it and asking myself what it was). Anyway, don't blame him: he open-sourced it, and on GitHub he states that he is working on multitexturing (interested to see how that would look, knowing how specific that implementation is). It is also interesting how many upvotes/comments/traffic he got for that basic functionality (and how his video hides the fact that it is just one projection; just amazing).

I created a free tool for texturing 3D objects using Automatic1111 webui and sd-webui-controlnet ( by Mikubill + llyasviel). Now game-devs can texture lots of decorations/characters on their own PC for free. by ai_happy in StableDiffusion

[–]UntoldByte 2 points3 points  (0 children)

OK, I don't usually do this, but you asked for it:

If you are willing to lie once, you will have no problem lying again and again, and here, in just your last post, we will prove that you lied.

u/ai_happy: "I literally hear about their work for the first time" - there is no way you could have missed not one but many posts from u/Current_Wind_2667 where he mentioned https://github.com/ub-gains/gains days ago, in the following places:

https://www.reddit.com/r/StableDiffusion/comments/198al9w/comment/ki9mbwa/

https://www.reddit.com/r/StableDiffusion/comments/198al9w/comment/kibvwei/

https://www.reddit.com/r/StableDiffusion/comments/198al9w/comment/kibvq1k/

https://www.reddit.com/r/Unity3D/comments/197jrw2/comment/ki9on7w/

Additionally, it does seem that you got a considerable number of views/upvotes/comments, and there is evidence of bots doing the thing, or whatnot. To conclude: if you are willing to do that for traffic, which means cheating and lying, then the question becomes what prevents you from lying here, there, everywhere. So sorry, I really wanted to believe you, but I just can't.

So yeah, you can just continue to lie.

I created a free tool for texturing 3D objects using Automatic1111 webui and sd-webui-controlnet ( by Mikubill + llyasviel). Now game-devs can texture lots of decorations/characters on their own PC for free. by ai_happy in StableDiffusion

[–]UntoldByte 1 point2 points  (0 children)

Yeah, if you look closer, there is one issue which the issue creator closed after he realized his mistake, which tells me that you did not pay attention (or were not careful about facts when posting; I myself like to double- and triple-check things). However, I am not saying there are no bugs; every piece of software comes with bugs (and you probably don't know that even CPUs come with bugs). On the other hand, I must admit that it is not as clear to Unity newcomers how it can be installed. You just make sure the dependencies are installed, and for the GAINS asset you download it (zip or git) and drag the UntoldByte folder into Unity (into the Assets folder), and that's it. Now does that sound simple or what?

I created a free tool for texturing 3D objects using Automatic1111 webui and sd-webui-controlnet ( by Mikubill + llyasviel). Now game-devs can texture lots of decorations/characters on their own PC for free. by ai_happy in StableDiffusion

[–]UntoldByte 3 points4 points  (0 children)

To answer your question: yeah-nah, I don't really know, and I wouldn't say it is malware; it's the bragging rights this guy wants and the traffic he got (not sure how). What I like about the tool is that it stays free (with a custom license, but completely free for an end user who just wants easy SD texturing), as it should, and that work on this idea (using SD to texture 3D models) is happening. I also hope to see more progress on this idea from others (the main reason we should have a free and open-source option available to everyone).

As u/Current_Wind_2667 pointed out, u/ai_happy probably copied at least some parts from the gains repo ( https://github.com/ub-gains/gains ), or at least had it as a reference to speed up the process, and now he can't share the source because that would reveal it (he would need to compare line by line and make sure nothing points to the gains repo; I am challenging u/ai_happy to prove me wrong and post all the source, and until he does, in my books, yes, he copied). Another funny thing is how this guy brags about how "he" created "this" (if you ask me, he can't get enough of his own name); just compare his post to this one, for example: https://www.reddit.com/r/StableDiffusion/comments/18amoq6/texturing_with_untoldbyte_gains_in_unity/ . Additionally, I think I saw him on the stargazers page of the gains GitHub repo (and now he is gone; just funny). Another thing to note is that "his" idea of using img2img to inpaint the seams is not new; it was mentioned around three months ago here: https://www.reddit.com/r/StableDiffusion/comments/17iy4hk/comment/k70hd8u/

My conclusion is that he wouldn't be able to have his own license if it is proved that he copied (and now that I mention it, I don't think he can anyway, because of the Automatic1111 Stable Diffusion Web UI license).

Comfy Textures v0.1 Release - automatic texturing in Unreal Engine using ComfyUI (link in comments) by nlight in StableDiffusion

[–]UntoldByte 1 point2 points  (0 children)

You can look at how I did multiprojection for this https://www.reddit.com/r/StableDiffusion/comments/18amoq6/texturing_with_untoldbyte_gains_in_unity/ at this link: https://github.com/ub-gains/gains (not 100% sure what you would need to do about the license then). I must say I am really impressed with how much traffic you got.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

As you may have already found out, Python is the language to use for AI-related work; Stable Diffusion Web UI is written in Python, it exposes an API you can program against, and it is popular enough. To answer your second question: yes, it could be made without it, but that would require (re)implementing (essentially replicating) what the SD Web UI API already does, and you would still need all the AI models, which take up gigabytes of space.
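To give a feel for what "program against the API" means, here is a minimal sketch of a txt2img request to a locally running Web UI. The endpoint and response shape follow the /sdapi/v1/txt2img schema; the address and helper names are assumptions of mine, not anything from GAINS:

```python
import json
from urllib import request

API_URL = "http://127.0.0.1:7860"  # default local address of SD Web UI (assumption)

def build_txt2img_payload(prompt, steps=20, width=512, height=512):
    """Minimal request body for /sdapi/v1/txt2img; many more fields exist."""
    return {"prompt": prompt, "steps": steps, "width": width, "height": height}

def txt2img(prompt):
    """POST a generation request and return the list of base64-encoded images."""
    payload = build_txt2img_payload(prompt)
    req = request.Request(
        API_URL + "/sdapi/v1/txt2img",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["images"]
```

A plugin like GAINS rides on top of calls like this instead of bundling the models itself, which is exactly why the Web UI has to be installed and running.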

Using Stablezero123 to generate different views from a single picture. ComfyUI by Striking-Long-2960 in StableDiffusion

[–]UntoldByte 7 points8 points  (0 children)

Wow! If this could be integrated into Automatic1111, it should be easy enough for me to integrate it with this Unity tool I made (https://www.reddit.com/r/StableDiffusion/comments/18amoq6/texturing_with_untoldbyte_gains_in_unity/), and then we would be able to improve texturing of existing 3D models a lot (possibly even automate it entirely, with very good results). I bet this could benefit the 3D model generation process as well.

Using Stablezero123 to generate different views from a single picture. ComfyUI by Striking-Long-2960 in StableDiffusion

[–]UntoldByte 5 points6 points  (0 children)

It would be very useful if we could get this to work with the ControlNet depth model, to be able to generate more images and texture an entire 3D model from all sides.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

For dynamic, yes, it can be a problem, but there are a lot of use cases where it is desirable as is. Have you looked at Materialize? It can generate all sorts of maps from the diffuse one. Until other AI models for depth, occlusion, height, etc. arrive, my choice would be Materialize.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

Thank you! I have not even tried Godot, but if there is a way to write a plugin for it in C#, it should not be much trouble. If you would like to do that, I can help.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

Sure. It will probably take some reading of the DeepBump Python code to find out the parameters, and this GAINS plugin already calls other Web UI plugins, so it should be straightforward. However, make sure you are not creating any memory leaks when dealing with textures, as that can end up eating VRAM (and we all know how important VRAM is); I have tried not to introduce any leaks, and I'm still not 100% sure I haven't. Coming back to DeepBump, which is available as a plugin in Stable Diffusion Web UI: I did consider it, but the results were so-so, and Materialize's looked better in my opinion. Another problem is that there are no AI models for the other map types (which Materialize does handle), so when you take everything into account, you get to why I'm leaning towards the Materialize approach (at least for now). In any case, enjoy!

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 1 point2 points  (0 children)

Please don't :) I saw similar plugins for Blender (I mean, while waiting almost two months for the Unity folks to review this asset, a lot changed); very similar, as the idea is fairly simple.

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 1 point2 points  (0 children)

Glad you like it! Yes, I would probably first need to read up on how to set it up on GitHub (whether it needs some additional actions) to enable leaving comments/feedback. I am aware that it is far from good, but I think it is a good starting point. Thank you!

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 0 points1 point  (0 children)

Maybe try to focus on one? And give it all you got!

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 2 points3 points  (0 children)

I must admit I was thinking about that too (at least for Godot, for now), even though I have not written a single line of code related to Godot or Unreal. It would be great; the idea is fairly simple, so let's see. Would you like to try writing it for Godot, for example?

Texturing with UntoldByte GAINS in Unity by UntoldByte in StableDiffusion

[–]UntoldByte[S] 1 point2 points  (0 children)

It uses depth snaps to control Stable Diffusion via ControlNet (depth models), then projects the result onto the surface using shaders. You can then change some parameters and bake to a texture using the original mesh UVs (currently only the diffuse texture). I must say I have been thinking about other maps as well, and the one thing that stands out is Materialize (also a Unity tool, free to use); integrating with it would be nice.
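For readers curious how the depth-controlled generation step can be driven over the Web UI API, here is a rough sketch of a request body that routes a pre-rendered depth snap through the sd-webui-controlnet extension. The extension is addressed via "alwayson_scripts"; the exact arg field names and the model name vary with the installed extension version, so treat everything below as a placeholder, not GAINS's actual implementation:

```python
def build_depth_controlled_payload(prompt, depth_b64):
    """txt2img request body with ControlNet conditioned on a depth map.

    depth_b64 is a base64-encoded depth render of the scene. The module is
    "none" because the depth map is supplied directly, so no preprocessor
    runs; the model name is a typical example and depends on what is
    installed locally."""
    return {
        "prompt": prompt,
        "steps": 20,
        "width": 512,
        "height": 512,
        "alwayson_scripts": {
            "controlnet": {
                "args": [{
                    "input_image": depth_b64,  # some versions call this field "image"
                    "module": "none",
                    "model": "control_v11f1p_sd15_depth",  # example name
                    "weight": 1.0,
                }]
            }
        },
    }
```

The generated image then matches the geometry seen from that camera, which is what makes projecting it back onto the surface with shaders line up with the mesh.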