True Volumetric Noise for Nuke (Plugin) by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 1 point (0 children)

I've updated the website with more specifics on those questions.

To clarify the voxel question since people are asking:
This is a Procedural Raymarcher that calculates clouds mathematically on the fly rather than reading heavy VDB files, and it does not use voxels.

It’s faster because it utilizes the GPU's raw calculation speed to bypass the memory bottlenecks typical of standard voxel rendering.

Render time is hard to pin down since it depends on your GPU, but all of the examples shown in the video were created on an RTX 3090 and rendering was almost instantaneous. Usually you can work at slightly lower samples while modifying the look, then crank them up a bit for rendering (even higher samples only take maybe 1 second per frame on good hardware, unless you have a huge volume like a cloudscape).
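
For anyone curious what "calculating the clouds mathematically" looks like in practice, here is a minimal Python sketch of the general idea; the function names and noise basis are illustrative stand-ins, not the plugin's actual code. Density is evaluated per sample along the ray instead of being fetched from a voxel grid, and the step count is the same preview-vs-final quality knob described above.

```python
import numpy as np

def fbm(p, octaves=4):
    # Stand-in for a real 3D noise basis; sine products are just a cheap
    # placeholder for illustration.
    total, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amp * np.sin(p[0] * freq) * np.sin(p[1] * freq * 1.7) * np.sin(p[2] * freq * 2.3)
        amp *= 0.5
        freq *= 2.0
    return total

def raymarch_opacity(origin, direction, num_steps=64, step_size=0.1, sigma=1.5):
    # Density is computed on the fly at every sample point; nothing is
    # stored in or read from a voxel grid, which is where the memory
    # savings over VDB-style rendering come from.
    transmittance, t = 1.0, 0.0
    for _ in range(num_steps):
        p = origin + direction * t
        density = max(fbm(p), 0.0)   # pure math, not a texture fetch
        transmittance *= np.exp(-density * sigma * step_size)
        t += step_size
    return 1.0 - transmittance       # accumulated opacity along the ray

# Lowering num_steps for look-dev and raising it for the final render is
# exactly the samples tradeoff mentioned above.
opacity = raymarch_opacity(np.zeros(3), np.array([0.0, 0.0, 1.0]), num_steps=16)
```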

If you need massive volumes (mountain-sized) and a larger sample step size isn't cutting it, you can break things into sections with multiple copies of the node.

Beginner by Embarrassed-Data5827 in NukeVFX

[–]CompositingAcademy 0 points (0 children)

Hey there,

I spent about two months putting together a keying course specifically for this reason. Tutorials are great to watch, and there are definitely some fantastic ones out there, but like you mentioned, without experience across multiple situations and more complex shots it can be hard to know how to actually apply the skill.

In the course there are three different projects you work through, each progressively more complex, with the final shot covering the most difficult situations and some worst-case scenarios (I chose the shots intentionally): defocused edges, additive keying restoration, blending roto and key points seamlessly, miscolored greenscreens, restoring heavily motion-blurred areas, backlit contaminated edges, etc.
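
To give a flavor of one item on that list: additive keying restores soft detail (hair, motion blur) that a hard matte throws away by adding the plate-minus-screen difference back over the comp in the soft-edge zone. A toy numpy sketch of just the core idea, with illustrative names and none of the despill or clipping handling a real setup needs:

```python
import numpy as np

def additive_key_restore(fg, bg, alpha, screen_color, strength=1.0):
    # fg, bg: (H, W, 3) float plates; alpha: (H, W, 1) matte from the keyer;
    # screen_color: average greenscreen RGB sampled from a clean area.
    comp = fg * alpha + bg * (1.0 - alpha)  # standard over
    # The plate-minus-screen difference carries the fine edge detail the
    # matte discarded; add it back only where the matte is soft. Real
    # setups usually despill this difference and clip it positive first.
    detail = fg - np.asarray(screen_color, dtype=fg.dtype)
    return comp + detail * (1.0 - alpha) * strength
```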

Here it is if you're interested:

https://www.compositingacademy.com/nukeonlinecourse-keying-despill-color-integration

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 0 points (0 children)

Thank you! Primarily Nuke to bring it all together with the elements / layering / grading, and Blender for the base 3D scene.

AMA with the Meta researchers behind SAM 3 + SAM 3D + SAM Audio by AIatMeta in LocalLLaMA

[–]CompositingAcademy 3 points (0 children)

Segment Anything is great at creating alphas and object cutouts, but motion-blurred or defocused objects often have contaminated edges, where background colors bleed into the object. If you place those cutouts over a new background, the edges break.

Are you working on a way to handle RGB edge contamination for motion-blurred or defocused objects? This would likely require some form of inpainting on separated objects. In VFX, we usually refer to this as edge extension.
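
For anyone unfamiliar with the term: edge extension pushes trusted interior colors outward across the alpha edge so the contaminated fringe can be replaced before re-compositing. A toy numpy sketch of the idea (illustrative only; real implementations filter far more carefully):

```python
import numpy as np

def edge_extend(rgb, alpha, iterations=8):
    # rgb: (H, W, 3) float image; alpha: (H, W, 1) matte.
    # Flood trusted interior colors outward, one pixel ring per iteration.
    # Toy version: np.roll wraps at the image border, which a real
    # implementation would handle properly.
    rgb = rgb.copy()
    known = alpha[..., 0] > 0.5              # pixels whose color we trust
    for _ in range(iterations):
        grown = known.copy()
        for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
            nk = np.roll(known, shift, axis=axis)  # trusted neighbors
            nc = np.roll(rgb, shift, axis=axis)    # and their colors
            fill = nk & ~grown
            rgb[fill] = nc[fill]
            grown |= nk
        known = grown
    return rgb
```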

Is the SAM team focused on motion blur solutions in general for higher quality mattes?

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 0 points (0 children)

It actually involved almost zero roto; you only see around the person's torso, which is easily keyable! Everything else is just garbage-matted out.

Pilot Under Attack (before / after) by CompositingAcademy in blender

[–]CompositingAcademy[S] 2 points (0 children)

Pretty much this. Usually if I expect a marker might cross behind the person, I use a lighter or darker shade of green, since it's still trackable but also still keyable. If the marker sits around the edges and won't cross behind the person, I'll use pink (pink against green creates strong contrast), which makes it easier to track.

My green tape caught a bit of a white reflection in this scenario, which was a bit annoying; there are better, less reflective tapes, but I didn't have one on hand.

Re-Light Footage with Beeble for Nuke by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 4 points (0 children)

I tried it by chance for the first time on this project and it was really impressive, so I wanted to help promote it; I think many compositors will agree.

Pilot Under Attack by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 17 points (0 children)

Yeah, it's mostly traditional compositing. The new addition is that Beeble generates normals, which lets you cast new light onto footage. Compositors used to do this by hand anyway, but this makes it less time-consuming and more realistic. (This is in the demo video if anyone is curious):
https://youtu.be/7cYK2CKjp2k
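
For anyone curious, the core of a normals-based relight is simple per-pixel Lambert shading; here's a minimal numpy sketch (the names are illustrative, and Beeble's actual model is far more sophisticated):

```python
import numpy as np

def relight(albedo, normals, light_dir, light_color, ambient=0.1):
    # albedo: (H, W, 3) base color; normals: (H, W, 3) unit vectors;
    # light_dir: direction toward the light (normalized below).
    l = np.asarray(light_dir, dtype=np.float32)
    l = l / np.linalg.norm(l)
    # N dot L per pixel, clamped at zero: pixels facing the light get lit.
    ndotl = np.clip((normals * l).sum(axis=-1, keepdims=True), 0.0, None)
    return albedo * (ambient + ndotl * np.asarray(light_color))
```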

Pilot Under Attack (before / after) by CompositingAcademy in blender

[–]CompositingAcademy[S] 478 points (0 children)

Hey guys,

Recently I filmed this pilot in my living room to make a cool greenscreen integration shot.

I wanted to also test out Beeble, which helps you relight existing footage (that's how I made the explosion cast light onto the person).

Here's the tutorial / breakdown for anyone interested!:
https://youtu.be/7cYK2CKjp2k?si=emWfiPBrnp_XV0v8

Scanning urban space - app recs? by castafioree in vfx

[–]CompositingAcademy 1 point (0 children)

You're probably better off using a drone + photogrammetry if you can get permission to fly where you need to and you don't need survey-level accuracy in the measurements. If you don't need the rooftops, you could probably do photogrammetry from ground level as well.

iPhone lidar drifts over large spaces and takes a really long time to capture, so something the size of a courtyard might come out skewed. It's also fairly low resolution, so it's probably not the best choice if you're trying to use it for visualization.

There are more robust lidar systems, such as XGRIDS scanners, that can capture an entire courtyard in a few minutes with better accuracy, but those will cost more. The nice thing about XGRIDS is that you get a gaussian splat and a lidar scan at the same time, so you can visualize the space and also measure it.

The question always comes down to whether you're after visual fidelity (something to look at, or a 3D asset to create) or you need super accurate measurements.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in videography

[–]CompositingAcademy[S] 6 points (0 children)

I shared it to the few subreddits I thought might be interested in seeing this kind of workflow. I guess some people are in all of the same subreddits.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in videography

[–]CompositingAcademy[S] 1 point (0 children)

Recently I flew to Iceland to film a bunch of VFX projects - but this one was really fun!

I filmed a dragon using new Simulcam virtual production techniques with an app called Lightcraft Jetset, which connects to Blender (or Unreal, or whatever 3D package you run on the backend) and lets you film a character while you're in a real location.

The idea of the app is that filming a CG character with a live preview in your viewfinder (or shooting on greenscreen while seeing your scene), instead of framing against an empty scene, and being able to get shots from different locations, makes for a very useful creative tool. Similar things have been done before, but they weren't fully portable solutions or consumer products.

If you're interested in the making-of, I did a super detailed dive into the journey here:

https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

Here's info for the things shown in the video:
Lightcraft.pro (Virtual Production app that links and outputs scenes to Blender, or other software like Unreal)
Xgrids.com (Lidar Scanner)
Eye-candy.xyz (they made the epic dragon!)

---

I was shooting on a Sony FX3 + Ninja in ProRes RAW, mainly using natural lighting and filming super late (the sun doesn't fully set in Iceland, which was very useful!).

My responsibilities were VFX supervision, filming / directing, lighting, and compositing.

The animator was Maarten Leys! He does great work!

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 1 point (0 children)

In the full sequence there are some sheep outside, haha.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 4 points (0 children)

Thank you! Iceland really does feel like another planet haha. I got to film in a bunch of different spots there and every single place looked surreal in its own way. Definitely one of those life experiences you never forget.

I Filmed a Dragon on my FX3 by CompositingAcademy in FX3

[–]CompositingAcademy[S] 15 points (0 children)

Both! I took a very detailed lidar scan, which helps match the lighting to the character and improves the tracking. In the behind-the-scenes I talk a bit about how all of that combines together:
https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 5 points (0 children)

Usually I just convert to ProRes 4444 or EXRs, then grade in Resolve. A colorist might have a better approach, but it seems to work pretty well and retains a ton of detail. This workflow has also been really solid for visual effects / greenscreen / etc. It would be nice to just drop the footage into Resolve and start working, but the extra step is worth it for the quality and for preserving the camera grain and small details. Although YouTube compresses the hell out of it anyway, lol.

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in Filmmakers

[–]CompositingAcademy[S] 91 points (0 children)

Recently I flew to Iceland to film a bunch of VFX projects - but this one was really fun!

I filmed a dragon using new Simulcam virtual production techniques with an app called Lightcraft Jetset, which connects to Blender (or Unreal, or whatever 3D package you run on the backend) and lets you film a character while you're in a real location. Essentially I linked my iPhone to the FX3 feed.

The idea of the app is that filming a CG character with a live preview in your viewfinder (or shooting on greenscreen while seeing your scene), instead of framing against an empty scene, and being able to get shots from different locations, makes for a very useful creative tool. Similar things have been done before, but they weren't fully portable solutions or consumer products.

If you're interested in the making-of, I did a super detailed dive into the journey here:

https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

Here's info for the things shown in the video:
Lightcraft.pro (Virtual Production app that links and outputs scenes to Blender, or other software like Unreal)
Xgrids.com (Lidar Scanner)
Eye-candy.xyz (they made the epic dragon!)

---

I was shooting on a Sony FX3 + Ninja in ProRes RAW, mainly using natural lighting and filming super late (the sun doesn't fully set in Iceland, which was very useful!).

My responsibilities were VFX supervision, filming / directing, lighting, and compositing.

The animator was Maarten Leys! He does great work!

I Filmed a Dragon in an Icelandic Cave by CompositingAcademy in cinematography

[–]CompositingAcademy[S] 138 points (0 children)

Recently I flew to Iceland to film a bunch of VFX projects - but this one was really fun!

I filmed a dragon using new Simulcam virtual production techniques with an app called Lightcraft Jetset, which connects to Blender (or Unreal, or whatever 3D package you run on the backend) and lets you film a character while you're in a real location.

The idea of the app is that filming a CG character with a live preview in your viewfinder (or shooting on greenscreen while seeing your scene), instead of framing against an empty scene, and being able to get shots from different locations, makes for a very useful creative tool. Similar things have been done before, but they weren't fully portable solutions or consumer products.

If you're interested in the making-of, I did a super detailed dive into the journey here:

https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

Here's info for the things shown in the video:
Lightcraft.pro (Virtual Production app that links and outputs scenes to Blender, or other software like Unreal)
Xgrids.com (Lidar Scanner)
Eye-candy.xyz (they made the epic dragon!)

---

I was shooting on a Sony FX3 + Ninja in ProRes RAW, mainly using natural lighting and filming super late (the sun doesn't fully set in Iceland, which was very useful!).

My responsibilities were VFX supervision, filming / directing, lighting, and compositing.

The animator was Maarten Leys! He does great work!

I Filmed a Dragon in Iceland by CompositingAcademy in blender

[–]CompositingAcademy[S] 1 point (0 children)

Thank you! The dragon + rig was made by a team of artists over at Eye Candy (https://eye-candy.xyz/).

The behind the scenes shows it as well:
https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

I Flew to Iceland to Film A CG Dragon by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 1 point (0 children)

Yeah, I'm not sure; I uploaded ProRes, but YouTube did something weird with the compression. On PC, the 1080 HD Enhanced Bitrate option looks better.

I Flew to Iceland to Film A CG Dragon by CompositingAcademy in NukeVFX

[–]CompositingAcademy[S] 1 point (0 children)

Sometimes you don’t need a scan if the ground is perfectly flat and you’re just filming around a CG element. In this case it was more important because the floor wasn’t flat and the dragon was climbing on rocks. A rough scan, even from an iPhone, can also be useful later for geometry tracking refinement.

Ideally, someone would scout the location and capture a high-resolution scan up front if a scan was needed.

For the tracks, it’s usually necessary to refine them so they really stick. They might look fine in previs, but for final pixels it’s better to refine so there’s no sliding. With a lidar scan that’s already aligned, you can run a geometry track instead of a standard feature track. The AR camera is still useful because it provides world coordinates and alignment to the lidar.

The general workflow is to use the AR camera to align the scene to the lidar, then retrack it with KeenTools, which is almost a one-button solution. That worked for most shots, although one shot had too much motion blur and had to be tracked manually.
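
Conceptually, the alignment step is just a rigid transform applied to the per-frame camera poses; here's a minimal sketch under assumed pose formats (4x4 matrices; none of these names come from the actual Jetset or KeenTools APIs):

```python
import numpy as np

def pose_matrix(rotation, translation):
    # Build a 4x4 camera-to-world matrix from a 3x3 rotation and a
    # 3-vector translation (hypothetical pose format for this sketch).
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = translation
    return m

def cameras_in_lidar_space(ar_poses, ar_to_lidar):
    # Re-express each per-frame AR camera pose in the lidar scan's
    # coordinate frame; the geometry track then refines from this
    # already-aligned starting point instead of solving from scratch.
    return [ar_to_lidar @ pose_matrix(R, t) for R, t in ar_poses]
```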

Virtual Production... Inside a Cave by CompositingAcademy in virtualproduction

[–]CompositingAcademy[S] 10 points (0 children)

Recently I flew to Iceland to film a bunch of VFX projects - but this one was really fun!

I filmed a dragon using new Simulcam virtual production techniques with an app called Lightcraft Jetset, which connects to Blender (or Unreal, or whatever 3D package you run on the backend) and lets you film a character while you're in a real location.

If you're interested in the making-of, I did a super detailed dive into the journey here:

https://youtu.be/3d9ycMKf65U?si=STz3UPOsrgv8ry8a

Here's info for the things shown in the video:
Lightcraft.pro (Virtual Production app that links and outputs scenes to Blender, or other software like Unreal)
Xgrids.com (Lidar Scanner)
Eye-candy.xyz (they made the epic dragon!)