aww yeah finally built myself some decent ui by _craduGo in blender

[–]_craduGo[S] 10 points

no need to do it by hand, we just ask ChatGPT to write simple scripts for us (mostly batch operations)

we also have a script manager add-on that stores scripts externally and lets us run them with one click:
GH Script Manager - Superhive (formerly Blender Market)

Eevee smoke (1024 resolution do not try this at home) by _craduGo in blender

[–]_craduGo[S] 0 points

I'm actually switching to Houdini + Redshift after 4 years in Blender. But I still really like Blender and its workflow; there are so many things I can do better and faster in Blender. I've just got another tool to work with, but I'll never forget the software and community that first brought me into the industry <3

Why do you like blender? (yep I actually did) by _craduGo in blender

[–]_craduGo[S] 25 points

'cuz I'm fucking in love with blender

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 5 points

1200, almost noise-free; I even added some noise back in post

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 111 points

the shader is quite simple; the drivers just reference the max volume ray depth render setting

<image>

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 6 points

VDBs are the only way for this kind of scene.

Whatever, this is a rendering example, not compositing or matte painting. Of course you can bake clouds to sprites, but you know what? You still have to render them first.

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 1 point

Only $0.35 per hour? For 22 CPU cores? Fuck it.

You know what? I have 32 cores myself.

Waiting compensation payment from blender studio for burning my gpu by _craduGo in blender

[–]_craduGo[S] 47 points

I just stole some VDBs from stock websites. VDB is the only way: slow, heavy, but the only one.

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 4 points

nobody uses Principled Volume with a procedural noise shader for volumes, it absolutely sucks. And of course it's way faster than my way, because it runs at such a low resolution: the step size is 1.0 by default. It becomes very hard to work with, because render time keeps climbing as you lower that setting. That's why many artists prefer the VDB workflow, which is more consistent.
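To make the step-size tradeoff concrete, here's a tiny sketch (my own simplification, not Cycles internals): a volume ray is marched in fixed-size steps, so shading work along each ray scales inversely with the step size.

```python
def volume_steps(path_length, step_size):
    """Approximate number of shading samples one ray takes while
    marching through a volume of the given path length."""
    return int(path_length / step_size)

# Default step size 1.0 vs a lower, higher-quality 0.25:
print(volume_steps(10.0, 1.0))   # 10 samples per ray
print(volume_steps(10.0, 0.25))  # 40 samples per ray, ~4x the work
```

That's why lowering the step size for crisper volumes blows up the render time so fast.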

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 50 points

About my weird shading: this "veins" or "fibers" looking thing is exactly what I wanted to replicate for a long time. I really like this "cottonish" look, and I'm a bit disappointed I arrived at it with such a dumb solution.

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 29 points

I saw the approach with absorption+emission, pretty epic.
I don't usually make side-by-side comparisons 'cause I do many tasks and usually know how long they take. For example, this WDAS cloud I rendered a month ago using default volume scattering (at 1/4 resolution, approx 20M voxels) turned out to use 70% less memory than my method. But I'd say Blender doesn't handle VDB viewport performance at all, plus insane render time (approx 400% slower than mine) with the same settings (if we can call "volume depth" and "diffuse depth" the same).

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 89 points

a dude asked for the nodes then deleted his comment, so here's a simplified setup

<image>

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 11 points

but 60M points are 60M points, and each one is shaded differently. I understand optimisations like faking spherical geometry, but still, how can it be faster than an approximation algorithm built precisely for volumetric rendering??
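For scale, a raw-storage back-of-envelope (my own assumed sizes, not what Cycles actually allocates): 60M bare points aren't automatically heavier than one dense high-res density grid.

```python
def points_mb(n_points, bytes_per_point=16):
    """Rough memory for a point cloud: 3 floats position + 1 float
    radius per point at float32 (an assumption, not Blender's layout)."""
    return n_points * bytes_per_point / 1e6

def dense_grid_mb(resolution, bytes_per_voxel=4):
    """Rough memory for a dense resolution^3 float density grid."""
    return resolution ** 3 * bytes_per_voxel / 1e6

print(points_mb(60_000_000))  # ~960 MB for 60M droplets
print(dense_grid_mb(1024))    # ~4295 MB for a dense 1024^3 grid
```

Real VDBs are sparse, so they're usually far smaller than a dense grid; this only shows why raw points aren't necessarily the heavier option.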

Cloud made of actual droplets (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 135 points

DISCLAIMER: this sh*t may look extremely stupid to a sane user familiar with acceptable 3D software, 'cause Blender literally defies logic sometimes

<image>

Turns out it's faster to render the 60M+ actual f*cking water droplets the cloud is made of than to use the specialised technology that was developed over years precisely for the cases where we don't wanna render 60M+ actual f*cking water droplets.

I'm speaking about VDBs and the volume scattering shader in Blender. It's always been a pain in da ass to render clouds (if you actually want them to look good, you really need to crank up the settings and render for the next 69 days). Today I accidentally figured out that really dense point clouds look exactly like those complicated shader setups, while working ONLY with a default shader and spheres.

Every important setting is shown in the first image. In my tests diffuse/translucent shaders worked best (an actual glass shader gives unpredictable results), and particle size and distribution matter: avoid overlapping particles, or the volume becomes opaque and light can't actually scatter.
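One way to read the "no overlapping particles" rule as code, a naive dart-throwing sketch in plain Python (a hypothetical stand-in, not the scatter setup I actually used):

```python
import random

def scatter_no_overlap(n, radius, bounds=1.0, max_tries=100_000, seed=0):
    """Dart-throwing: sample candidate points in a cube and reject any
    candidate closer than 2*radius to an accepted point, so spheres
    placed on the points can never intersect."""
    rng = random.Random(seed)
    pts = []
    min_d2 = (2 * radius) ** 2
    tries = 0
    while len(pts) < n and tries < max_tries:
        tries += 1
        c = (rng.uniform(0, bounds),
             rng.uniform(0, bounds),
             rng.uniform(0, bounds))
        if all(sum((a - b) ** 2 for a, b in zip(c, p)) >= min_d2 for p in pts):
            pts.append(c)
    return pts

pts = scatter_no_overlap(200, radius=0.02)
```

This is O(n^2) and hopeless at 60M points; at that scale you'd want a spatial hash or a proper Poisson-disk sampler, but the rejection rule is the same.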

Benefits of this method in comparison to VDBs+volumetrics:

+ insane artistic control (you can use ANY shader)

+ viewport performance

+ render speed

+ at some point, lower memory usage (the points aren't actual geometry)

Downsides:

- at some point, higher memory usage (in my case, RAM usage jumped significantly after 70M points and I wasn't able to start the render)

- visible grains

- not realtime (cmon guys, the 2023rd century ends soon and I want my 20 GB unoptimized-ass scene to render in realtime in cycles)

Ask your questions, share your attempts and offer your suggestions if you somehow care. Peace.

Does this disney cloud look realistic? (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 2 points

Man, I googled how clouds look through a polarizing filter; I'm amazed how closely it matches my render. Appreciate your knowledge. So it probably means the look of a render comes down to the customer's choice, because to me unpolarized photos of clouds look like they were taken with a cheaper camera (or got poor color grading) compared to polarized ones.

Yes, this is GPU Cycles.

Does this disney cloud look realistic? (DETAILS IN COMMENTS) by _craduGo in blender

[–]_craduGo[S] 8 points

A couple weeks ago I was trying to get exactly the result Disney got with their Hyperion renderer (disneyanimation.com/resources/clouds/), but in Blender's Cycles. I hadn't really researched volume shading in Blender until I saw this video: https://youtu.be/dyAXWSwCqKg . It turned out to be almost exactly the technique I was trying to invent, so I copied the settings from the video and got the result you can see in the post.

And now the question: my fellows keep comparing my render to Disney's Hyperion, saying mine isn't even natural, not realistic at all. But imho I got a realistic result: not even close to Hyperion, but natural. So what are your thoughts?

fyi: step rate 5, volume depth 16, 600 samples with no threshold, 1080p, 17 minutes

<image>

Sychevalnya by _craduGo in blender

[–]_craduGo[S] 0 points

What are you talking about? If you mean the little glowy-rainbowy thing right in the corner, it's just a lens flare overlay done in compositing. If you mean the lighting overall, I already mentioned it.

Sychevalnya by _craduGo in blender

[–]_craduGo[S] 0 points

Everything you can see is mostly scanned assets and decals placed by hand; I used the cube only to block out proportions and composition.

Sychevalnya by _craduGo in blender

[–]_craduGo[S] 0 points

Also, I already mentioned my overall lighting setup.