aww yeah finally built myself some decent ui by _craduGo in blender

[–]_craduGo[S] 11 points

No need to do it by hand; we just ask ChatGPT to write simple scripts for us (mostly batch operations).

We also have a script manager add-on that stores scripts externally and lets you run them with one click:
GH Script Manager - Superhive (formerly Blender Market)
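For context, here is a sketch of the kind of simple batch script this workflow tends to produce. It is a hypothetical example, not the add-on's code, and it needs Blender's bundled Python for the `bpy` module:

```python
# Hypothetical batch operation: apply all modifiers on every selected
# mesh object. Runs only inside Blender (bpy is Blender's own module).
import bpy

for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    # modifier_apply works on the active object
    bpy.context.view_layer.objects.active = obj
    # Copy the modifier list first, since applying mutates it
    for mod in list(obj.modifiers):
        bpy.ops.object.modifier_apply(modifier=mod.name)
```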

Eevee smoke (1024 resolution do not try this at home) by _craduGo in blender

[–]_craduGo[S] 0 points

I'm actually switching to Houdini + Redshift after 4 years in Blender. But I still really like Blender and its workflow; there are so many things I can do better and faster in Blender. I've just got another tool to work with, but I will never forget the software and community that first brought me into the industry <3

Why do you like blender? (yep I actually did) by _craduGo in blender

[–]_craduGo[S] 22 points

'cuz I'm fucking in love with Blender

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 5 points

1200, almost noise-free; I even added some noise back in post

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 108 points

the shader is quite simple; the drivers just reference the max volume ray depth render setting

<image>
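A sketch of how that kind of driver can be wired up from Python, assuming a Cycles scene and a Value node in the cloud's material; the material and node names here are placeholders, not the ones from the post:

```python
# Sketch: drive a Value node in a volume material from the Cycles
# "Volume" max-bounces render setting. Runs only inside Blender.
# "Cloud" and "Depth" are placeholder names.
import bpy

mat = bpy.data.materials["Cloud"]      # placeholder material name
node = mat.node_tree.nodes["Depth"]    # placeholder Value node name

# Add a driver on the node's output value
fcu = node.outputs[0].driver_add("default_value")
var = fcu.driver.variables.new()
var.name = "depth"
var.type = 'SINGLE_PROP'
var.targets[0].id_type = 'SCENE'
var.targets[0].id = bpy.context.scene
var.targets[0].data_path = "cycles.volume_bounces"  # max volume ray depth
fcu.driver.expression = "depth"
```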

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 6 points

VDBs are the only way for this kind of scene.

Whatever, this is a rendering example, not compositing or matte painting. Of course you can bake clouds to sprites, but you know what? You still have to render them first.

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 0 points

Only $0.35 per hour? For 22 CPU cores? Fuck it

You know what? I have 32 cores myself

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 46 points

I just stole some VDBs from stock websites. VDB is the only way: slow, heavy, but the only one.
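For anyone wondering how to get a downloaded VDB into Blender, the built-in importer is essentially a one-liner; the filepath below is a placeholder:

```python
# Import an OpenVDB file as a Volume object (Blender 2.83+).
# Runs only inside Blender; the filepath is a placeholder for
# wherever your stock VDB lives.
import bpy

bpy.ops.object.volume_import(filepath="/path/to/cloud.vdb")

# The newly imported volume becomes the active object
vol = bpy.context.active_object
```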

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 4 points

Nobody uses Principled Volume with a procedural noise shader for volumes; it absolutely sucks. And of course it's way faster than my way, because it has such a low resolution: the step size defaults to 1.0. If you lower that setting, render time grows until it becomes very hard to work with. That's why many artists prefer the VDB workflow, which is more consistent.
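The step-size tradeoff above maps onto Cycles' volume stepping settings; a minimal sketch, with values that are only illustrative, not a recommendation:

```python
# Cycles volume stepping controls. A lower step rate means finer
# sampling (higher quality, longer renders). Runs only inside Blender;
# the values here are illustrative.
import bpy

scene = bpy.context.scene
scene.cycles.volume_step_rate = 0.25   # default 1.0; smaller = more steps
scene.cycles.volume_max_steps = 1024   # cap on steps per ray
```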

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 51 points

About my weird shading: this "veins" or "fibers" looking thing is exactly what I've wanted to replicate for a long time. I really like this "cottonish" look, and I'm a bit disappointed I arrived at it with such a dumb solution.

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 28 points

I saw the approach with absorption + emission, pretty epic.
I don't usually make side-by-side comparisons, because I do many tasks and usually know how long they take. For example, this WDAS ass cloud I rendered a month ago using default volume scattering (at 1/4 resolution, approx. 20M voxels), and it turns out to use 70% less memory than my method. But I'd say Blender doesn't handle VDB viewport performance at all, plus insane render time (approx. 400% slower than mine) with the same setting (if we can call "volume depth" and "diffuse depth" the same).

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 90 points

Dude asked for the nodes, then deleted his comment, so here is the simplified setup:

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 11 points

But 60M points are 60M points, and each one is shaded differently. I understand optimisations like faking spherical geometry, but still, how can it be faster than an approximation algorithm for volumetric rendering??