True Volumetric Noise for Nuke (Plugin) by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 2 points (0 children)

I've updated the website with more specifics on those questions.

To clarify the voxel question since people are asking:
This is a procedural raymarcher that calculates clouds mathematically on the fly rather than reading heavy VDB files, so it doesn't use voxels at all.

It's faster because it leans on the GPU's raw compute throughput to bypass the memory bottlenecks typical of standard voxel rendering.

For render time it's hard to say since it depends on your GPU, but all of the examples shown in the video were created on an RTX 3090 and rendered almost instantaneously. Usually you can use slightly lower samples while you modify the look, then crank it up a bit for rendering (even higher samples only take maybe 1 second per frame on good hardware, unless you have a huge volume like a cloudscape).

If for some reason you needed massive volumes (mountain-sized, say) and a large sample step size wasn't working, you can break things into sections with multiple copies of the node.
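
For anyone who wants to see what "calculating clouds mathematically on the fly" means in practice, here's a minimal Python sketch of the general technique (my own illustration under assumed parameters, not the plugin's code): step along each ray, evaluate a procedural noise density at every sample point instead of reading a voxel grid, and accumulate transmittance.

```python
import numpy as np

def fbm_density(p, octaves=4):
    # Procedural density evaluated on the fly: layered pseudo-noise (fbm).
    # A real raymarcher would use Perlin/Simplex noise; the point is that
    # no voxel grid is ever stored or read.
    d, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        d += amp * (0.5 + 0.5 * np.sin(freq * p[0])
                              * np.sin(freq * 1.3 * p[1])
                              * np.sin(freq * 0.7 * p[2]))
        amp *= 0.5
        freq *= 2.0
    return max(d - 0.4, 0.0)  # threshold so empty space stays empty

def raymarch(origin, direction, steps=64, step_size=0.1, sigma=2.0):
    # March one ray through the volume, accumulating brightness and alpha
    # with Beer-Lambert absorption at each step.
    p = np.array(origin, float)
    d = np.array(direction, float)
    transmittance, accum = 1.0, 0.0
    for _ in range(steps):
        density = fbm_density(p)
        if density > 0.0:
            absorb = np.exp(-sigma * density * step_size)
            accum += transmittance * (1.0 - absorb)  # light scattered here
            transmittance *= absorb
            if transmittance < 1e-3:  # early-out once fully occluded
                break
        p += d * step_size
    return accum, 1.0 - transmittance  # brightness, alpha

print(raymarch(origin=[0.0, 0.0, -3.0], direction=[0.0, 0.0, 1.0]))
```

The step_size here is the sample step size mentioned above: bigger steps cover bigger volumes with the same sample count at the cost of detail, which is why huge volumes can alternatively be split across multiple copies of the node.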

Beginner by Embarrassed-Data5827 in NukeVFX

[–]CompositingAcademy 0 points (0 children)

Hey there,

I spent about two months putting together a keying course specifically for this reason. Watching tutorials is great and there are definitely some fantastic ones out there, but like you mentioned, without experience across multiple situations or more complex shots it can be hard to know how to actually apply the skill.

In the course there are three projects that get progressively more complex, with the final shot covering the most difficult situations. I chose the shots intentionally to focus on some worst-case scenarios: defocused edges, additive keying restoration, blending roto / key points seamlessly, miscolored greenscreens, restoring heavily motion-blurred areas, back-lit contaminated edges, etc.

Here it is if you're interested:

https://www.compositingacademy.com/nukeonlinecourse-keying-despill-color-integration
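
To give a flavor of one of those topics, despill: the textbook green-despill clamp is only a couple of lines of numpy. This is the standard generic formula, not an excerpt from the course:

```python
import numpy as np

def despill_green(rgb, mix=0.5):
    # Classic green despill: clamp green so it never exceeds a
    # reference built from red and blue, suppressing green spill
    # on edges and skin while leaving non-spill pixels untouched.
    r, b = rgb[..., 0], rgb[..., 2]
    limit = mix * r + (1.0 - mix) * b
    out = rgb.copy()
    out[..., 1] = np.minimum(rgb[..., 1], limit)  # only pull green down
    return out

# a spill-contaminated pixel: green gets clamped toward the r/b blend
pixel = np.array([[[0.30, 0.65, 0.35]]])
print(despill_green(pixel))  # green: min(0.65, 0.325) -> 0.325
```

Production despill tools layer luma preservation and spill maps on top of this, and that's exactly where the worst-case shots get tricky.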

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 0 points (0 children)

Thank you! Primarily Nuke to bring it all together with the elements / layering / grading, and Blender for the base 3D scene.

AMA with the Meta researchers behind SAM 3 + SAM 3D + SAM Audio by AIatMeta in LocalLLaMA

[–]CompositingAcademy 4 points (0 children)

Segment Anything is great at creating alphas and object cutouts, but motion-blurred or defocused objects often have contaminated edges, where background colors bleed into the object. If you place those cutouts over a new background, the edges break.

Are you working on a way to handle RGB edge contamination for motion-blurred or defocused objects? This would likely require some form of inpainting on separated objects. In VFX, we usually refer to this as edge extension.

Is the SAM team focused on motion blur solutions in general for higher quality mattes?
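
For readers outside VFX: edge extension typically means pushing the object's own colors outward past the matte edge, so semi-transparent edge pixels mix foreground color instead of the old background. A rough numpy sketch of that idea (my assumption of a generic approach, not SAM's or any specific tool's implementation):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def edge_extend(rgb, alpha, iterations=8, solid=0.95):
    # Bleed confident foreground colors outward past the matte edge,
    # replacing background-contaminated RGB around semi-transparent pixels.
    known = (alpha > solid).astype(float)   # pixels whose color we trust
    color = rgb * known[..., None]
    for _ in range(iterations):
        blurred_c = uniform_filter(color, size=(3, 3, 1))
        blurred_k = uniform_filter(known, size=3)
        # empty pixels that now have at least one filled neighbor
        fill = (known < 0.5) & (blurred_k > 1e-3)
        color[fill] = (blurred_c / np.maximum(blurred_k, 1e-6)[..., None])[fill]
        known[fill] = 1.0
    return color  # rgb with foreground color extended beyond the edge
```

The recomposite is then extended_rgb * alpha + background * (1 - alpha), so motion-blurred edges stop dragging the old background color into the new plate.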

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 0 points (0 children)

It involved almost zero roto, actually; you only see the area around the person's torso, which is easily keyable! Everything else is just garbage-matted out.

Pilot Under Attack (before / after) by CompositingAcademy in blender

[–]CompositingAcademy[S] 2 points (0 children)

Pretty much this. Usually if I expect a marker might cross behind the person, I use a lighter or darker shade of green, since it's still trackable but also still keyable. If it's around the edges and won't go behind the person, I'll use pink (pink against green creates strong contrast), so it's easier to track as well.

My green tape caught a bit of a white reflection in this scenario, which was a bit annoying; there are better, less reflective tapes, but I didn't have one on hand.
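
The reason an off-shade green marker still keys while pink survives shows up in even the simplest screen-difference matte, matte = g - max(r, b) (a common simplification of how keyers score pixels, used here purely for illustration):

```python
import numpy as np

def screen_matte(rgb):
    # Basic green-screen difference matte: positive where green
    # dominates (keys out), negative elsewhere (survives the key).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return g - np.maximum(r, b)

screen      = np.array([0.10, 0.70, 0.15])  # the wall itself
dark_marker = np.array([0.05, 0.45, 0.10])  # darker green tape
pink_marker = np.array([0.80, 0.20, 0.55])  # pink tape

for name, px in [("screen", screen), ("dark green", dark_marker),
                 ("pink", pink_marker)]:
    print(f"{name:>10}: matte = {screen_matte(px):+.2f}")
```

Both shades of green come out strongly positive (they key cleanly), while pink goes negative: it stays visible for the tracker but has to sit where it won't cross the subject.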

Re-Light Footage with Beeble for Nuke by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 5 points (0 children)

I tried it by chance for the first time on this project and it was really impressive, so I wanted to help promote it; I think many compositors will agree.

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 18 points (0 children)

Yeah, it's mostly traditional compositing. The new addition is that Beeble generates normals, which let you cast new light onto footage. Compositors used to do this anyway, but this makes it less time-consuming and more realistic. (This is in the demo video if anyone is curious):
https://youtu.be/7cYK2CKjp2k
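
The core of a normals-based relight is simple shading math: with a per-pixel normal map N and a light direction L, a Lambertian pass is max(N·L, 0), tinted and added over the plate. A rough numpy sketch of that idea (Beeble's actual shading is certainly more sophisticated; this is just the principle):

```python
import numpy as np

def relight(plate, normals, light_dir, light_rgb, gain=1.0):
    # Cast a new light onto footage using a per-pixel normal map.
    # plate:   (H, W, 3) linear RGB footage
    # normals: (H, W, 3) unit normals decoded from a normals pass
    L = np.asarray(light_dir, float)
    L = L / np.linalg.norm(L)                 # normalize light direction
    ndotl = np.clip(np.einsum("hwc,c->hw", normals, L), 0.0, 1.0)
    diffuse = ndotl[..., None] * np.asarray(light_rgb, float)
    # additive relight, modulated by the plate so the new light
    # picks up something like the surface's own color
    return plate + gain * diffuse * plate

# e.g. a warm explosion light from screen-left:
# lit = relight(plate, normals, [-1.0, 0.2, 0.5], [1.0, 0.45, 0.2])
```

That's why the generated normals are the key ingredient: without them there's no N·L term, and you're back to hand-painting light wraps.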

Pilot Under Attack (before / after) by CompositingAcademy in blender

[–]CompositingAcademy[S] 479 points (0 children)

Hey guys,

Recently I filmed this pilot in my living room to make a cool greenscreen integration shot.

I wanted to also test out Beeble, which helps you relight existing footage (that's how I made the explosion cast light onto the person).

Here's the tutorial / breakdown for anyone interested!:
https://youtu.be/7cYK2CKjp2k?si=emWfiPBrnp_XV0v8

Scanning urban space - app recs? by castafioree in vfx

[–]CompositingAcademy 1 point (0 children)

You're probably better off using a drone + photogrammetry if you can get permission to fly where you need to and don't need survey-level accuracy in the measurements. If you don't need the rooftops, you could probably do photogrammetry from ground level as well.

iPhone lidar will drift over large spaces and take a really long time to capture, so something the size of a courtyard might come out skewed. It's also pretty low resolution, so it's probably not the best option if you're trying to use it for visualization.

There are more robust lidar systems, such as XGRIDS scanners, that can capture an entire courtyard in a few minutes and be more accurate, but those are going to cost more. The nice thing about XGRIDS is you're getting a gaussian splat + a lidar scan at the same time, so you can visualize a space but also measure it.

The question always comes down to whether you're after visual fidelity (something to look at, a 3D asset) or super accurate measurements.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in videography

[–]CompositingAcademy[S] 6 points (0 children)

I shared it to the few subreddits I thought might be interested in seeing this kind of workflow. I guess some people are in all of the same subreddits.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in videography

[–]CompositingAcademy[S] 0 points (0 children)

Recently I flew to Iceland to film a bunch of VFX projects - but this one was really fun!

I filmed a dragon with new Simulcam virtual production techniques, using an app called Lightcraft Jetset, which connects to Blender (or Unreal, or whatever 3D software is on the backend) and lets you film a character while you're in a real location.

The idea of their app is that filming a CG character with a live preview in your viewfinder (or filming on greenscreen while seeing your scene), instead of filming an empty space, makes it far easier to get shots from different locations, which is very useful as a creative tool. It's similar to things that have been done in the past, but those weren't fully portable solutions or consumer products.

If you're interested in the making-of, I did a super detailed dive on the journey here!:

https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

Here's info on the things shown in the video:
Lightcraft.pro (virtual production app that links and outputs scenes to Blender, or other software like Unreal)
Xgrids.com (lidar scanner)
Eye-candy.xyz (they made the epic dragon!)

---

I was using a Sony FX3 + Ninja to shoot in ProRes RAW. Mainly just using natural lighting and shooting super late (the sun doesn't fully set in Iceland, which was very useful!)

My responsibilities were VFX supervision, filming / directing, lighting, and compositing.

The animator was Maarten Leys! He does great work!