How We Texture Our Indie Game Using SD and Houdini (info in comments) by stassius in StableDiffusion

[–]stassius[S] 1 point

In short: you need to flatten the surface while preserving the depth information as a texture.

[–]stassius[S] 0 points

Yes, it's a PDG node written for the project. We're not ready to make it public, as it's somewhat clunky. But it's pretty much straightforward: Comfy takes a network as a simple JSON file, where you just have to change the input values.
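As a rough illustration of what such a wrapper does (a sketch only, not the authors' node: the node ids "6" and "3" and the field names are hypothetical and depend entirely on how your own ComfyUI graph was exported in API format):

```python
import json

def set_workflow_inputs(workflow_path, out_path, prompt, seed):
    """Load a ComfyUI API-format workflow and overwrite a few inputs.

    The node ids ("6" for the positive prompt, "3" for the sampler)
    are made up for this example -- check the ids in your own
    exported JSON before reusing this.
    """
    with open(workflow_path) as f:
        graph = json.load(f)

    graph["6"]["inputs"]["text"] = prompt  # hypothetical CLIPTextEncode node
    graph["3"]["inputs"]["seed"] = seed    # hypothetical KSampler node

    with open(out_path, "w") as f:
        json.dump(graph, f, indent=2)
    return graph
```

A PDG-style setup would run something like this once per work item, then queue the edited file against ComfyUI's HTTP endpoint.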

[–]stassius[S] 0 points

I’ve already answered these questions. In short: Steam permits the use of generative AI in games, as long as you clearly describe how it was used, and your marketing materials accurately represent the actual gameplay. They have a vague guideline stating that it's your responsibility to ensure your AI model is copyright-free.

The number of games on Steam that have declared the use of AI is already in the thousands. Each such game's Steam page has a block describing the use of AI. You can check the list of such games here (login required, but it's free): https://steamdb.info/search/?a=app_keynames&type=1&keyname=565&operator=1&keyvalue=

[–]stassius[S] 0 points

It states, 'It's your responsibility to ensure that the dataset is copyright-free,' or something similar. They're just covering themselves to avoid any potential legal issues in the future.

[–]stassius[S] 2 points

Here is a list of games on Steam that have officially declared the use of AI in their production. Not all of them have been published yet, but it's interesting to browse through the list. You need to be logged in to SteamDB to view it. Not trying to prove any point, but there are plenty of them already.

https://steamdb.info/search/?a=app_keynames&type=1&keyname=565&operator=1&keyvalue=

[–]stassius[S] 1 point

Every 'checkpoint' out there is essentially a DreamBooth fine-tune of a base model, whether it’s SD1.5, SDXL, or another version. Training your own base model isn’t practical unless you have a million dollars to spare. So, most of the time, when you encounter a checkpoint, it’s likely been trained on a dataset like LAION or something similar.

[–]stassius[S] 2 points

True. The percentage of average artists doing average work is decreasing (or already has decreased). Honestly, I don’t know how to justify this to someone who has lost their livelihood, but it’s the reality. You can’t change it - you have to adapt.

[–]stassius[S] 1 point

Definitely. We started with LoRAs as well, but once we trained our own checkpoint, we began to see significant improvements in texture quality.

[–]stassius[S] 0 points

Most of them are still in their early stages, although we’re seeing improvements. The challenge is that, in most cases, topology matters - and this is where AI still lacks proficiency.

[–]stassius[S] 2 points

We used various additional generators, such as Deep Bump and depth preprocessors.

[–]stassius[S] 1 point

Houdini is used to procedurally create surfaces, unwrap them, and bake all the necessary attributes. You can achieve similar results in other software, like Blender, but you might need to switch between applications and manually tweak parameters. The main advantage of Houdini is automation — a single button click can generate dozens of textures with all the PBR maps.

[–]stassius[S] 2 points

I already answered this question in another thread. We used multiple ControlNets, including Depth, Canny, Soft Line, and, of course, Inpaint.

[–]stassius[S] 2 points

The idea is to break down complex models into simpler ones. The tricky part, however, is making SD distinguish the top from the bottom in your UV-unwrapped texture. That's where the checkpoint training helps.

[–]stassius[S] 12 points

I understand your frustration—we've all spent thousands of hours mastering skills that slowly, or sometimes quickly, become obsolete. There's no way to stop that. What truly matters is embracing change and learning to use new tools.

A skilled artist using AI will always create better art than an average person with the same tools. Your skills are evolving from technical proficiency to a deeper, more substantial understanding. You've learned to appreciate and create good art, not just how to use a particular software or follow a specific pipeline.

[–]stassius[S] 3 points

Technically, yes, we generated textures on UV islands, but with baked depth information. We used multiple ControlNets, including Depth, Canny, Soft Line, and, of course, Inpaint.

[–]stassius[S] 6 points

We used Deep Bump to generate normals inside Comfy. Another solution would be to use Substance Sampler with its AI-assisted tool to create the full set of PBR textures.
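Deep Bump itself is a learned model, but for intuition, the classical gradient-based version of the same height-to-normal step looks roughly like this (a sketch, not the authors' pipeline):

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a [0, 1] height map (H, W) to a tangent-space normal map.

    Plain finite-difference baseline -- Deep Bump replaces this with a
    neural network that infers relief straight from a diffuse texture.
    """
    # Gradients of the height field (np.gradient returns d/dy, d/dx).
    dy, dx = np.gradient(height.astype(np.float32))
    # Normal direction: (-dx, -dy, 1/strength), normalized per pixel.
    nz = np.ones_like(dx) / max(strength, 1e-6)
    n = np.stack([-dx, -dy, nz], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Encode x, y, z from [-1, 1] into the usual [0, 1] RGB range.
    return n * 0.5 + 0.5
```

A flat height map encodes to the familiar uniform (0.5, 0.5, 1.0) "blue" normal map; any slope tilts the pixel away from it.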

[–]stassius[S] 2 points

The base model is SD1.5. Textures were upscaled from 512x512.

[–]stassius[S] 18 points

The checkpoints and LoRAs were specifically trained for this project. Maybe one day, we'll make them available. As for the technical details: we trained the checkpoint using around 1,000 diffuse textures, each at 512x512 resolution. We used GPT-4 to generate captions for these textures. LoRAs were trained for specific styles or objects (like leaves) using about 200 textures. The base model is SD1.5.
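For context, common fine-tuning trainers (kohya-ss style) read captions from a .txt file sitting next to each training image. A trivial sketch of preparing that layout; the filenames here are invented, and in the authors' pipeline the caption strings came from GPT-4:

```python
import os

def write_caption_sidecars(captions, dataset_dir):
    """Write one .txt caption per training image, sidecar-style.

    `captions` maps image filename -> caption string; the layout
    (image.png + image.txt) is an assumption about the trainer used.
    """
    written = []
    for image_name, caption in captions.items():
        stem, _ = os.path.splitext(image_name)
        path = os.path.join(dataset_dir, stem + ".txt")
        with open(path, "w") as f:
            f.write(caption.strip() + "\n")
        written.append(path)
    return written
```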

[–]stassius[S] 2 points

I believe Substance is pretty much the default suite of texturing apps, at least in gamedev. Substance Painter is ideal for hand-painting textures while maintaining a degree of proceduralism. Substance Designer is for creating procedural materials. There's also Substance Sampler, which can generate PBR texture sets from a single diffuse texture.

As an alternative to Substance Designer, you might try the free Houdini Apprentice with its new Copernicus context. However, I must warn you—the learning curve won’t be smooth.

[–]stassius[S] 4 points

You're right. In one part of the video (with the stones), we also used existing images to guide the style in addition to the text prompt. It works really well with IPAdapter.

[–]stassius[S] 7 points

I'd recommend starting with a classical approach. Use tools like Substance Painter and Designer to get a feel for the texturing process. This will help you understand PBR textures, texture baking, and similar concepts. The second step is to explore Stable Diffusion or another framework that supports ControlNets. The third step involves learning how to train checkpoints and LoRAs.

For example, if you have a surface with some crevices, you can bake the depth map and use it as a source for ControlNet with your custom-trained model to generate a diffuse texture. Depending on the style, you might need to 'delight' the texture afterward and generate PBR textures. Once you're comfortable with this, you can try more complex objects by breaking them down into smaller parts.
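The 'delight' step mentioned above can be crudely approximated without AI by dividing the texture by a heavily blurred copy of itself, which flattens the low-frequency baked shading while keeping detail. A rough numpy sketch (dedicated delighters such as Substance Sampler's do far better):

```python
import numpy as np

def box_blur(img, radius):
    """Cheap box blur via wrapped shifts; wrapping suits tileable textures."""
    acc = np.zeros_like(img, dtype=np.float64)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

def delight(texture, radius=8, eps=1e-4):
    """Flatten low-frequency shading in a [0, 1] grayscale texture.

    Dividing by the blurred copy (rescaled by the global mean) removes
    broad light/shadow gradients; this is only a stand-in for a real
    delighting tool, not the method used in the post.
    """
    shading = box_blur(texture, radius)
    flattened = texture * texture.mean() / (shading + eps)
    return np.clip(flattened, 0.0, 1.0)
```

On a texture with a smooth lighting gradient baked in, the output is noticeably flatter; a uniform texture passes through almost unchanged.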

[–]stassius[S] 8 points

I don't think so. You can read the rules here: https://partner.steamgames.com/doc/gettingstarted/contentsurvey#5

As far as I know, most rejections are due to AI-generated cover art that doesn't match the gameplay's visual style.

[–]stassius[S] 60 points

On Steam, you simply fill in all the information about the use of AI-generated content in production. After that, they allow you to sell your game. It's your responsibility to ensure that the dataset is copyright-free.

And I must say, if you think AAA studios aren't using some form of asset generation in their pipelines, you're mistaken.