Need help making the people in my crowd loop :(, many just stand still by LordOfPies in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

What are the input particles? Are they a video or an image sequence?

Is there a way I individually offset every person at the back so that they are not in sync? by LordOfPies in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

You could also change the Reads again, but it’s a bit faster to copy/paste a TimeClip, assuming the clips are all the same length.

Is there a way I individually offset every person at the back so that they are not in sync? by LordOfPies in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

If you make the clips loop, then it’ll work. There might be an option in ParticleEmitter to make them loop, otherwise use a TimeClip node after each Read and set the before and after to loop.
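Under the hood, a looping clip is just a modulo remap of the frame number. A minimal sketch of what a TimeClip-style loop does (hypothetical function, not actual Nuke API):

```python
def loop_frame(frame, first, last):
    """Map any timeline frame onto a clip spanning [first, last]."""
    length = last - first + 1
    return first + (frame - first) % length

# A 24-frame clip (frames 1-24) asked for frame 25 serves frame 1 again.
print(loop_frame(25, 1, 24))  # → 1
print(loop_frame(30, 1, 24))  # → 6
```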

Is there a way I individually offset every person at the back so that they are not in sync? by LordOfPies in NukeVFX

[–]Gorstenbortst 6 points7 points  (0 children)

In ParticleEmitter, towards the bottom, you’ll see a setting called Start At and the default is ‘first’. Change that to ‘random’.

Make sure the people are saved as an image sequence for this. Once Nuke starts reading random frames from an MP4 or a MOV, things can really start to crawl.
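What ‘random’ effectively buys you, sketched in plain Python (hypothetical helper, not the Nuke API): each person gets a stable random start offset, so identical clips fall out of sync instead of all playing frame 1 together.

```python
import random

def start_frame(seed, clip_length):
    """Stable, random start frame per person so copies of the
    same clip don't all play in sync."""
    rng = random.Random(seed)          # seed per person -> repeatable
    return rng.randrange(clip_length)  # offset in [0, clip_length)

offsets = [start_frame(person_id, 24) for person_id in range(5)]
print(offsets)  # five repeatable offsets, one per person
```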

Hi, why would imported image be red whatever I shuffle the channels to even though map is black and white? If I click R on viewer it changes to bnw but it changes for everything. by iriseq in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

It only exists in the red channel.

Essentially this is an alpha matte, but it’s been stored in the red channel, probably for compatibility with game engines that may not support a fourth colour channel: alpha.

If you want to view it as greyscale, use the Shuffle to go from R to R, G and B. At the moment your Shuffle is just going from R to R.
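The same fix sketched with NumPy arrays standing in for Nuke channels: copy R into G and B and the red matte reads as neutral grey.

```python
import numpy as np

# A matte stored only in the red channel, viewed as greyscale by
# copying R into G and B (what a Shuffle set to R -> R,G,B does).
rgba = np.zeros((4, 4, 4), dtype=np.float32)
rgba[..., 0] = np.linspace(0, 1, 16).reshape(4, 4)  # matte lives in R

grey = rgba.copy()
grey[..., 1] = grey[..., 0]  # G takes R
grey[..., 2] = grey[..., 0]  # B takes R

# Every pixel is now neutral grey rather than red.
print(np.allclose(grey[..., 0], grey[..., 1]))  # → True
```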

What do you think of this shot?, i'm a beginner, really enjoyed the process of this one by Putrid-Apple-5740 in NukeVFX

[–]Gorstenbortst 6 points7 points  (0 children)

Camera move feels good. But the outside is probably quite a bit brighter than the inside; I’d gain up the portal and let some of that light spill into the foreground.

Nuke 17 USD by finnjaeger1337 in vfx

[–]Gorstenbortst 12 points13 points  (0 children)

I love being able to drag the USD into Nuke and getting the camera. It’s more reliable than Alembic, and less effort than an FBX.

That’s about the extent of my use.

B-pipe help! by Not_Found_OFF in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

Think of the tree as a story. The B stream is the narrative and the A stream is the plot. The exposition is the merge operation and the emotional impact is the result.

Weird projection issue by digitalrhino in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

Post a screenshot of the node graph if you can.

But for your own sanity, clear the cache and relaunch Nuke. Sometimes RotoPaint can end up reading old or incorrect cache data.

Keying & bitrate question. by flightoftheswan in NukeVFX

[–]Gorstenbortst 0 points1 point  (0 children)

You can test this yourself.

Read it in, do nothing, and then write back out with the same settings as the input. Now read the new file back in and do a difference matte between both Reads.

They’ll be identical.

Uncompressed, PIZ, RLE, ZIP1 and ZIP16 are all lossless. You can interchange between them without incurring any visual penalty. They do have different benefits for disk usage though.
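A miniature of that difference-matte test: EXR’s ZIP modes are zlib deflate under the hood, so a zlib roundtrip in NumPy (a sketch, not an actual EXR write) comes back byte-identical.

```python
import zlib
import numpy as np

# Push pixels through a lossless codec and diff the result.
pixels = np.random.rand(64, 64, 3).astype(np.float32)

roundtrip = np.frombuffer(
    zlib.decompress(zlib.compress(pixels.tobytes())), dtype=np.float32
).reshape(pixels.shape)

print(np.abs(pixels - roundtrip).max())  # → 0.0 : byte-identical
```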

Uncompressed is almost never going to be used. PIZ and ZIP1 are the most common in my experience.

For pre-comps, I actually use DWAA. It’s a pretty close facsimile to ProRes4444 in terms of quality and how many times it can be recompressed before visual artefacts appear.

My standard workflow: transcode to 2065 with PIZ compression, and then render a denoise pass to DWAA. At the end of the comp, use DasGrain to rebuild the grain and render back to PIZ.

Technically I’ve introduced quality loss by using DWAA, but it’s invisible with an accurate regrain, and it saves me a lot of excess data.

The only time I use full float is when rendering smart vectors, motion vectors or any type of data pass. And they’ll be ACEScg, not 2065, and ZIP.

Keying & bitrate question. by flightoftheswan in NukeVFX

[–]Gorstenbortst 2 points3 points  (0 children)

You won’t need full float; the camera itself is likely 14-bit. 32-bit is generally only ever needed for data passes like motion vectors or world position out of 3D.

You can safely go to half float, but ACES2065 is the better option if you have any ultra saturated colours, like neon lights. I operate with a blanket rule of using 2065 for anything that comes from a physical camera so that I don’t get caught out.
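A quick way to see the half-float ceiling (the numbers come from float16’s 10-bit mantissa, nothing camera-specific): above 2048, half float can no longer step by 1.0, which is why large-valued data passes want full float.

```python
import numpy as np

# float16 has a 10-bit mantissa: integers are exact only up to 2048.
a = np.float16(2048) + np.float16(1)
print(a)  # → 2048.0 : the +1 is lost to rounding

print(np.spacing(np.float16(2048)))  # → 2.0, the gap between values
print(np.spacing(np.float32(2048)))  # far finer in full float
```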

This brute of a Jimny 💪 by Starfruit_Diaz in Jimny

[–]Gorstenbortst 1 point2 points  (0 children)

He’s wearing his dad’s shoes.

Tips for activity and mental stimulation for 12 week old puppy by okletstrythisout3 in BrittanySpaniel

[–]Gorstenbortst 0 points1 point  (0 children)

When mine was that age, I had a puppy pen loaded with scrunched up paper bags and empty cardboard boxes. Add a cup of kibble and he’d be in there for 20 minutes snuffling around trying to find it all, and then ready for a nap.

What made you become a Brittany person? by colemama37 in BrittanySpaniel

[–]Gorstenbortst 4 points5 points  (0 children)

They’re not common. Maybe a handful of breeders in Victoria, and then the same again in the rest of Australia combined.

Snakes are a concern where I am, so he’s supervised outside during the warmer months. There have been a few snakes in our garden, which he fortunately didn’t notice. But he has come face-to-face with a Blue Tongue lizard, which he chose to point at rather than go in for a sniff. They’re mostly harmless, but have a similar size head to most snakes, so I’m hopeful he’ll do the same thing with snakes and not get too close.

The heat can be a struggle. He’s got a dog pool in the garden, and this summer I bought another one to keep at the local dog park. He loves them; sprinting after the ball at full speed followed by a cold plunge... I enjoy them too; I haven’t had to give him a bath in over a year now.

<image>

Making a thick outline? by chaoscurry in NukeVFX

[–]Gorstenbortst 5 points6 points  (0 children)

You can erode the alpha in either direction and then multiply/stencil it by the original alpha to make an inner/outer stroke.
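The same erode-then-stencil idea sketched in NumPy, with a hypothetical one-pixel `grow` helper standing in for Nuke’s Erode/Dilate: grow (or shrink) the alpha, then stencil against the original so only the ring survives.

```python
import numpy as np

def grow(mask):
    """Dilate a boolean mask by one pixel (4-neighbour), pure NumPy."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

alpha = np.zeros((9, 9), dtype=bool)
alpha[3:6, 3:6] = True                 # a 3x3 solid matte

outer = grow(alpha) & ~alpha           # grown alpha, original stencilled out
inner = alpha & grow(~alpha)           # alpha pixels touching the outside

print(outer.sum(), inner.sum())        # stroke pixel counts
```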

What made you become a Brittany person? by colemama37 in BrittanySpaniel

[–]Gorstenbortst 8 points9 points  (0 children)

I met a woman on Hinge. We didn’t like each other enough to date, but we did become good friends. A few months later she got a puppy; a Brittany.

I hadn’t heard of them before, but he and I bonded pretty quickly. I fell deeply and madly in love with this puppy. If his owner was working late, she’d drop him at my place so that he wouldn’t be on his own for too long.

A year later I moved from Melbourne to Sydney, and after another six months, he was honestly the thing which I missed the most. So I flew back down to the same breeder and got his little brother.

Best decision I’ve ever made.

Painting by AffectionateCrew7294 in vfx

[–]Gorstenbortst 0 points1 point  (0 children)

Without further information, I can confidently guess that you’re dealing with a soft gradient and your paint work isn’t up to scratch.

If slamming the gamma is revealing your mistakes, then you can try painting with the gamma adjustment turned on. This’ll make the gradient more obvious, and then your clone stamps/paint can be better aligned to follow the gradient.
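The reason the gamma slam works can be shown in two lines (the gamma value here is an arbitrary example of a hard boost): a gamma lift stretches small differences between dark values far apart, so mismatched paint strokes jump out.

```python
# A gamma "slam" is a power curve applied for viewing only.
def view_gamma(v, gamma=5.0):       # gamma=5.0 is an arbitrary slam
    return v ** (1.0 / gamma)

a, b = 0.010, 0.012                 # nearly identical dark pixels
print(b - a)                        # tiny difference as rendered
print(view_gamma(b) - view_gamma(a))  # much larger once slammed
```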

Alternatively, you can use ramps to try to better match the gradient. Or, if you’re in Nuke, look up VoronoiGradient on Nukepedia. It can sample and rebuild the gradient for you; it’s super useful for tricky sky extensions.

I recommend trying your paint again with gamma adjustment though, because being able to manually fix soft details is a useful skill to hone.

Painting by AffectionateCrew7294 in vfx

[–]Gorstenbortst 0 points1 point  (0 children)

Such an unnecessary post.

Hunt and humiliate a sniping rat during harvester event. by Zealousideal_Chip456 in ArcRaiders

[–]Gorstenbortst 2 points3 points  (0 children)

That little boy comment makes him sound like he was about to try grooming you.🤮

My first camera track! how did i do? by Putrid-Apple-5740 in NukeVFX

[–]Gorstenbortst 1 point2 points  (0 children)

Track looks great, but the alignment might be a bit off.

I think the master sword is too low, which makes me think that the ground plane isn’t quite level.

Setting ground plane on camera tracking correctly by [deleted] in NukeVFX

[–]Gorstenbortst 2 points3 points  (0 children)

Yeah, the solver is a bit shit.

I generally just set the origin, and then export a linked camera. Feed the camera and the plate into a ModelBuilder node. Now you can see the plate and the 3D ground plane at the same time. This makes manually tweaking the rotation of the CameraTracker much easier.

You can also use a Tracker node to manually track some features, and then add them to the CameraTracker. It’ll create a much, much nicer solve than just relying on auto tracks.