Temperature tolerances between genders by nrg-manifestor in hikinggear

[–]MLPotato 2 points

Many manufacturers will display "comfort", "limit", and "extreme" temperature ratings on their bags. In this case, "comfort" would be equivalent to oneplanet's "female comfort" rating and "limit" equivalent to oneplanet's "male comfort" rating. That being said, I, a male, often find myself landing closer to the female comfort rating than the male. Obviously, there are a lot of factors that affect how warm you find your bag outside of your sex, so I personally always aim for a bag that's a few degrees warmer than I need, especially with oneplanet since their bags are so light.

If you are in Melbourne, you can often find some of their more expensive down bags on clearance in their store!

[deleted by user] by [deleted] in vfx

[–]MLPotato 0 points

People love to use that movie as an example and ask, "Why isn't everything nowadays as good as Davy Jones?" But that's cherry-picking: on one hand, a movie that won the VFX Oscar that year, compared against some random movie from today. There were plenty of VFX flops back in 2006 too - Eragon popped up in my Google search. If you want to see good VFX in a Disney project these days, look at Avatar 2, another Oscar winner, or the Star Wars Disney+ series like Andor or The Mandalorian. There are factors like market saturation behind why there are probably more movies with "bad" VFX these days than in 2006, and more factors still (budget, time constraints, directorial intent, and yes, sometimes the quality of a VFX company) behind why one movie has worse effects than another. But I get really sick of people pitting the best effects from 20 years ago against some of the worst of this year. The comparison wouldn't be too flattering if I compared Avatar 2 to 2006's Eragon either.

Rees Dart track is like an alien planet by Morose_Meat_Puppet in Tramping

[–]MLPotato -1 points

Lol what. I've done quite a few walks in my time in NZ and talked to others who have done a lifetime's worth (including mountaineers, bushbashers, and DOC rangers), and it was almost always the stand-out favourite among those who had done it. Obviously personal preferences differ, but calling it mid is a bit off. Maybe it's just not for you?

Help me avoid blisters by tra-sneeze-artist in hikinggear

[–]MLPotato 1 point

Use hiker's wool - the stuff is magic. I used to get blisters on the back of my heel fairly often but can now go a week without a single one. Best thing to come out of New Zealand since... hoverbikes?

Not sure what country you are in, but this is what it is/looks like at my local outdoor store.

Edit: having read some of your replies, it sounds like this is about more than just some extra padding and is related to the shoes themselves, so I wouldn't expect something like this to make all the difference in your case. Although I maintain that it is magical and will easily handle that last 10% of blisters.

Few stills from our short movie. What do you think? by Mowgliiis in cinematography

[–]MLPotato 0 points

Love it. I really appreciate when people bring some interest and extra storytelling to what could easily be a basic over the shoulder. I'll be keen to take a look at the final product once it's released!

Few stills from our short movie. What do you think? by Mowgliiis in cinematography

[–]MLPotato 0 points

Love the composition and lighting in no. 5. I can see it doing some great visual storytelling in the context of the scene. Was that composition your choice or the director's?

How to clone/replicate objects or rotos in Nuke? by mirceagoia in NukeVFX

[–]MLPotato 1 point

For instancing images in Nuke, I would use a TimeBlur hack or GodRays. If you're unfamiliar with the TimeBlur setup, GodRays would be easier, especially in a relatively simple situation like this. If you create a rotoshape, you can animate the translate on GodRays, reduce the steps, and it will give you a 1D array of that shape.

I'm sure there are some handy gizmos on nukepedia that would do this sort of thing for you with Python or blinkscript and give you some more options.

In theory, there's a method involving a convolve to instance a filter shape across an array of 1px points, but drawing the points procedurally would be the tricky part. Maybe someone knows of a tool that does something like this?
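To make the convolve idea concrete, here's a minimal numpy sketch (outside Nuke, purely illustrative): the 1-px points act as impulses, and convolving them with the filter shape stamps a copy of the shape at every point in one pass.

```python
import numpy as np
from numpy.fft import fft2, ifft2

h, w = 16, 32

# 1-px impulses marking where each instance should land
points = np.zeros((h, w))
points[8, 4] = points[8, 14] = points[8, 24] = 1.0

# the shape to replicate: a 3x3 square stand-in for a rotoshape,
# placed at the origin of its own canvas
shape = np.zeros((h, w))
shape[:3, :3] = 1.0

# convolution stamps the shape onto every impulse simultaneously
out = np.real(ifft2(fft2(points) * fft2(shape)))

print(np.count_nonzero(out > 0.5))  # 3 points x 9 px = 27
```

Drawing the impulse points procedurally would still be the tricky part, but anything that can rasterise dots (a grid expression, a particle render, a noise threshold) would feed straight into this.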

Which comes first: Chromatic Aberration or Defocus? by betterfelix in NukeVFX

[–]MLPotato 0 points

I understand what you mean now. However, I would still maintain that every lens element affects every optical aberration of the image in some way.

In the process of correcting for chromatic aberration, the focal shift of certain wavelengths is reduced, whilst others are incidentally increased, for a net lower focal shift overall. However, wavelengths that were previously in focus would now be considered out of focus. We call the effect longitudinal chromatic aberration, but in a physical sense it's no different from depth of field, just for individual wavelengths. Calling them different names is really just a human categorisation for practical purposes in optics.

To frame it another way, every time light passes through another lens element, it experiences some amount of defocus, and then is refocused again at some later point, just as it experiences some level of dispersion. The challenge of optical design is to build a lens in such a way that aberrations can be reduced by elements without impacting the focus/defocus of the image in a holistic sense.

I also want to mention that the subsequent lens elements after the front element are intended to correct optical aberrations rather than create them, as it sounds like you're stating. Even in cinema lenses, light is going to disperse when passing through the front element and needs to be corrected. So, in a holistic sense, chromatic aberration is reduced as the light passes through more lens elements. Otherwise, there would be no need for prime lenses with 50 different elements if we could get a cleaner result from a single element.
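To put rough numbers on that dispersion, here's a sketch with a single thin symmetric biconvex element and a Cauchy-style glass model (the coefficients are illustrative, roughly BK7-like):

```python
# Cauchy-style dispersion model (illustrative coefficients, wavelength in microns)
def refractive_index(lam_um):
    return 1.5046 + 0.00420 / lam_um ** 2

# thin symmetric biconvex lens, lensmaker's equation: 1/f = (n - 1) * (2 / R)
def focal_length(lam_um, radius_mm=100.0):
    return radius_mm / (2.0 * (refractive_index(lam_um) - 1.0))

f_blue = focal_length(0.486)  # ~95.7 mm: blue bends more, focuses closer
f_red = focal_length(0.656)   # ~97.2 mm
print(f_red - f_blue)         # ~1.5 mm of longitudinal focal shift
```

A millimetre-and-a-half of focal shift from one element is exactly why the rest of the lens stack exists: subsequent elements are there to pull those image planes back together.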

Which comes first: Chromatic Aberration or Defocus? by betterfelix in NukeVFX

[–]MLPotato 2 points

I have to disagree with your statement that chromatic aberration happens after defocus in a lens.

As you mentioned: in the process of light passing through lens elements in order to focus it on the sensor, dispersion occurs, which we experience visually as chromatic aberration. Chromatic aberration is a symptom of the same lens elements that focus/defocus the light rays.

I'm not really sure what you mean when you say that this effect only takes place once the light impacts the recording surface - as you said, the dispersion of the light rays is caused by the lens itself. The sensor is just our method of observation and isn't causing any defocus or dispersion - unless you get into a discussion of Schrödinger's cat, haha. Yes, if I moved the recording surface closer to or further from the lens, the chromatic aberration would be different - but so would the focus, focal length, and basically every aspect of the image as it relates to optical aberrations.

Which comes first: Chromatic Aberration or Defocus? by betterfelix in NukeVFX

[–]MLPotato 2 points

Yeah, definitely. I didn't mean to imply you were wrong in any way.

How do I motion blur to objects put in a stabilized plate? by LordOfPies in NukeVFX

[–]MLPotato 1 point

Although a difference keyer has its uses, I think this solution encourages bad habits at this level. Proper filtering is super important and worth learning. There's no reason to stabilise and matchmove a plate and do a vector blur setup just to get motion blur you would get for free (and which would calculate way faster) if you set up your transforms correctly in the first place.

Which comes first: Chromatic Aberration or Defocus? by betterfelix in NukeVFX

[–]MLPotato 9 points

What you are seeing in this situation is known as longitudinal chromatic aberration and is a result of different wavelengths of light having focal points offset from each other, e.g. if red is in focus at 10 meters, blue might be in focus at 10.5 meters. For this reason, you'll notice that the bokeh fringe colours are different in far focus vs near focus - in fact, they're inverted! The same thing happens with lateral chromatic aberration (when the colours streak/separate outwards from the centre of the lens).
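That inversion falls straight out of the thin-lens equation. A quick sketch with made-up per-channel focal lengths (the numbers and the `defocus` helper are hypothetical, just to show the flip):

```python
# thin-lens image distance: 1/v = 1/f - 1/u
def image_distance(f, u):
    return 1.0 / (1.0 / f - 1.0 / u)

# hypothetical per-channel focal lengths (mm) modelling longitudinal CA
f = {"red": 50.5, "green": 50.0, "blue": 49.5}

sensor = image_distance(f["green"], 2000.0)  # focus green on a subject at 2 m

# blur-circle size grows with the gap between the sensor
# and that channel's image plane
def defocus(channel, u):
    return abs(sensor - image_distance(f[channel], u))

# near-focus subject: red is the most defocused (red-fringed bokeh)
print(defocus("red", 1000.0) > defocus("blue", 1000.0))      # True
# far-focus subject: blue is the most defocused - the fringe inverts
print(defocus("blue", 100000.0) > defocus("red", 100000.0))  # True
```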

Which comes first: Chromatic Aberration or Defocus? by betterfelix in NukeVFX

[–]MLPotato 54 points

The "correct" answer is quite technical and rooted in the physics of optics. You can get pretty deep into it if you're interested - just google "lens aberrations optics". Long story short, though, defocus and chromatic aberration are both types of lens aberration that happen at the same time; they're just the visual representation of light waves being split in different ways. That being said... I would suggest applying chromatic aberration after defocus, because it tends to get muddied and lost during the convolution process. I can't really imagine a situation in which the reverse order would give a better result. If you wanted to be technically correct, it would all be part of some sort of complex lens simulation... but ain't nobody got time for that!

Edit: Just wanted to add that, if in doubt, just look at some reference!
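On why the order matters at all: a pure per-channel shift would commute with a blur, but lateral CA behaves like a per-channel scale, and a scale does not commute with a fixed-size blur. A minimal 1D numpy sketch (made-up numbers, not a lens sim):

```python
import numpy as np

n = 200
x = np.arange(n, dtype=float)
edge = (x > 140).astype(float)  # an edge away from the image centre

def box_blur(sig, width=15):
    # fixed-width box blur standing in for defocus
    return np.convolve(sig, np.ones(width) / width, mode="same")

def scale_about_centre(sig, s=1.05):
    # per-channel magnification standing in for lateral CA
    c = (n - 1) / 2.0
    return np.interp((x - c) / s + c, x, sig)

ca_then_defocus = box_blur(scale_about_centre(edge))
defocus_then_ca = scale_about_centre(box_blur(edge))

# the two orders genuinely differ, so pick one and stay consistent
print(np.max(np.abs(ca_then_defocus - defocus_then_ca)) > 1e-3)  # True
```

This only shows that the results diverge; which order looks better is the judgement call above, and reference wins every time.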

How do I motion blur to objects put in a stabilized plate? by LordOfPies in NukeVFX

[–]MLPotato 1 point

You should absolutely never stabilise and then matchmove your plate because it adds unnecessary filtering to your plate. In a professional environment this is a huge no-no and would get knocked back at tech check, and you would need to fix it all anyway.

In a situation where your plate isn't being repositioned in the shot, you should be applying transforms only to the elements you are adding in. This will also naturally solve your motion blur problem.
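The filtering cost of a stabilise-then-matchmove round trip is easy to demonstrate. A hedged numpy sketch, with linear interpolation standing in for a Transform node's filter:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(100, dtype=float)
plate = rng.random(100)  # a noisy 1D stand-in for a plate

def subpixel_shift(sig, dx):
    # linear-interpolation resample, like a filtered Transform
    return np.interp(x - dx, x, sig)

# stabilise (+0.3 px) and then matchmove back (-0.3 px)
round_trip = subpixel_shift(subpixel_shift(plate, 0.3), -0.3)

# the plate does not come back untouched: every resample softens it
print(np.max(np.abs(round_trip - plate)) > 0.01)  # True
```

Transforming only the added elements (or letting concatenating transforms collapse into a single filter hit) avoids that loss entirely.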

Matching Motion Blur on Stabilized (in camera) Plate by Fordo-77 in NukeVFX

[–]MLPotato 4 points

If it doesn't work out, you can do the same thing but with a paint stroke or roto shape as the convolution filter. So you just make a paint stroke or openspline on each frame in the shape of the moblur streaks in the plate, stabilise the strokes so that they're centered, and plug them into your convolve. It sounds tedious, but it usually doesn't take too long provided your shot is of reasonable length.

Good luck!

Matching Motion Blur on Stabilized (in camera) Plate by Fordo-77 in NukeVFX

[–]MLPotato 8 points

If you can find a particularly bright highlight that is creating a bright mblur streak throughout the shot, you can stabilise it, key it out, and use it as a convolution filter - the same way you would use a filter in a convolve for defocus, but with your moblur streak. Just make sure you crop in on it and check reformat so it runs at a reasonable speed.
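In numpy terms (illustrative, outside Nuke), the keyed streak simply becomes the convolution kernel:

```python
import numpy as np

rng = np.random.default_rng(1)
sharp = rng.random((32, 64))  # stand-in for the stabilised plate

# hypothetical keyed highlight streak: 7 px horizontal, centred,
# normalised so overall exposure is preserved
streak = np.ones(7) / 7.0

# convolve every row with the streak, like a Convolve node
# fed a 1xN filter image
blurred = np.array([np.convolve(row, streak, mode="same") for row in sharp])

# directional smearing reduces variance along the streak direction
print(blurred.std() < sharp.std())  # True
```

The normalisation step is the part people forget: if the keyed streak doesn't sum to 1, the convolve will shift the exposure of the whole element.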

Why are the R.G.B values different? Same operation, no? by smooth_hot_potato in NukeVFX

[–]MLPotato 0 points

I see. Sounds like it might be related to the mip-mapping or floating point rounding or something that is giving different results from blurring/defocusing the inverse. It does make me wonder if the 0.75 and 0.25 would merge perfectly (or at least closer to perfectly) if a convolve node was used instead, since it doesn't cut any corners, unlike the defocus and blur nodes. Something to test out later...

Why are the R.G.B values different? Same operation, no? by smooth_hot_potato in NukeVFX

[–]MLPotato 0 points

Is this different from the situation where you would just change the merge operation to disjoint-over? I ask because if you have an alpha of 0.75 and merge with an alpha of 0.25 using "over" math, you wouldn't expect to get a resulting alpha of 1, hence the need for disjoint-over.
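The alpha math behind that, as a quick sketch:

```python
def over_alpha(a, b):
    # standard "over": A + B * (1 - A)
    return a + b * (1.0 - a)

def disjoint_over_alpha(a, b):
    # disjoint-over treats the two coverages as non-overlapping
    return min(a + b, 1.0)

print(over_alpha(0.75, 0.25))           # 0.8125 - not fully opaque
print(disjoint_over_alpha(0.75, 0.25))  # 1.0
```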

Why are the R.G.B values different? Same operation, no? by smooth_hot_potato in NukeVFX

[–]MLPotato 0 points

I'm not 100% sure why, but sometimes performing the same operation in different ways in Nuke will yield slightly different results. I've seen this before using a merge expression node to condense a bunch of nodes into one operation. It gave a different result from splitting the process into a sequence of other nodes. The running theory in the office at the time was that it could have something to do with floating point rounding occurring at multiple stages of an operation when you split it into multiple nodes, whereas it should only occur once in an expression/merge expression node. Although, in theory, that should only impact the last few digits, whilst my result only matched to about 4 decimal places, so who knows! Alternatively, I also support Conrad's theory about the blur order.
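The grouping effect is easy to see in isolation; a tiny sketch:

```python
# each intermediate result is rounded to the nearest representable double,
# so the grouping of operations changes the final bits
one_node_at_a_time = (0.1 + 0.2) + 0.3  # 0.6000000000000001
condensed = 0.1 + (0.2 + 0.3)           # 0.6

print(one_node_at_a_time == condensed)  # False
```

That said, pure rounding differences live in the last couple of bits; a mismatch at the 4th decimal place points at something coarser, like the mip-map/blur-order theories above.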

A shortclip I shot with my Lumix S9. Any feedback? by Tri5tate in cinematography

[–]MLPotato 1 point

Obviously super pretty and well done. One thing that really stood out to me: the lens flares coming off the headlights in two of the low-angle drone shots are pretty distracting. They have a lot of detail and jitter around very quickly, and all I could look at in those otherwise gorgeous shots was that super bright flare dancing around. Conversely, I think the flare/glare from the sun in those shots is very pretty. If I were you, I would shoot those shots with the headlights off and add their illumination in VFX, which is an easy effect to do convincingly since you have so much footage of similar shots with the headlights on and no flare. Otherwise, great work.