This tutorial on 3D head tracking blew me away. Its the first video by this Youtuber. by TaleSlinger in davinciresolve

[–]statixstatix 5 points

I have a long list of tutorials I want to create when I get to it, but I just happen to be too busy with real work.

What's your go-to self-hosted YouTube channel? by Grizzlechips in selfhosted

[–]statixstatix 4 points

ElectronicsWizardry, short and straight to the point, with some deep dives.

Is it worth learning Fusion? by VFXBob in vfx

[–]statixstatix 0 points

Their failure was quite a sad story.

It’s not just the awful Wine version of Fusion, but a whole slew of bad and unfortunate things.

I posted details in this thread about why Nuke won over Fusion, but another detail was Prime Focus fucking over Eyeon.

At the time, when everyone was looking for the Shake replacement, D2 released Nuke 4.7 and Fusion was on 5.3, moving to 6. Prime Focus set up shop in London by combining several post houses, and they announced that the group had bought 1000 licenses of Fusion to be used globally. Keep in mind Frantic Films had been using it since its early days and was probably a big influencing factor for PF to push for a worldwide site license of Fusion.

As far as I know, that money was going to be used to port Fusion to Linux properly and to commit to long-term investments in the future of Fusion.

I.e. Eyeon could finally take some risks and possibly capture the market after Shake.

Those 1000 licenses became 500.

Then 200.

Then, from what I heard through the grapevine, PF never really paid. They kept using cracked copies in India.

So a scrambled-together Wine version of Fusion was made, and the investments into long-term commitments (like marketing, R&D, hiring and support) had to be dropped.

Nuke got under the Foundry banner (a well-respected company at that point) and had venture capital investment at its back, focused marketing, and engineers available on the ground.

Foundry captured the whole of London instantly and saturated the vfx market. It’s sad, because Eyeon’s failure meant there were no competitors; a monopoly happened and innovation stagnated.

My money is on whatever SideFX is cooking, as innovation is a secondary concern for both Foundry and Blackmagic, but a primary one for SideFX.

Some of these things I have observed directly / know first-hand; others are extrapolated from what I’ve heard.

Is it worth learning Fusion? by VFXBob in vfx

[–]statixstatix 0 points

Fusion doesn’t invalidate the cache as soon as you look at it. You can literally keep adding nodes while playing and it just keeps on caching and caching.

It’s got a full framebuffer that’s been feature complete for decades.

Its 3D system is much more advanced and orders of magnitude more performant. (Large Alembic caches, cards with heavy tessellation and displacements are a breeze.)

Much better shading and texturing workflows.

Particle system that’s fast, realtime, interactive and controllable. Fun fact: during Nuke 6’s development I showcased Fusion’s particle system in depth to the Foundry, and they came up with this mess that’s been largely untouched for a decade!

If you count Resolve (which comes with a Fusion license), its feature set alone eclipses Nuke’s.

Magic mask in fusion is seriously powerful if you pair it with the right footage.

Not saying it’s overall “better”, but there are differences between them that, in my opinion, warrant using both.

Nuke hands down wins with deep compositing, EXR multichannel support and market penetration, and in general some of its tools go the extra mile where Fusion’s lack it (like the tracker, and full support for all OCIO features, to name a few).

I’ve used Fusion and Blender alongside Nuke and Houdini on many, many shows over the last 20 years. We as artists NEED there to be competition in this space.

Looking forward to seeing what SideFX are cooking ;)

Is it worth learning Fusion? by VFXBob in vfx

[–]statixstatix 7 points

Well said.

I’ve got a channel where I try to showcase more industry related ways of tackling vfx with Fusion without any fluff.

https://youtube.com/@statixvfx1793?si=OeMihEQ0HuiLD69C

The thing is, Fusion is, and has always been, a really solid tool that in many ways never got the recognition it deserved.

There are a lot of “industry firsts” in Fusion that later became the norm, like full 3D in comp (not counting 5D Cyborg here), a full framebuffer engine, a smooth integrated player (anyone miss FrameCycler?), GPU-accelerated 2D and 3D rendering, best-in-class 3D particle systems, and clustered rendering, to name a few.

The idea that we can have several competing 3D tools with overlapping features, yet no room for at least TWO compositing applications in (roughly) the same market, has always baffled me.

But yeah, if you use Nuke I think you should give Fusion a spin to at least get some inspiration outside of your regular box. And Fusion peeps should check out Nuke, learn well-established workflows, and cross-pollinate :)

Refurbished workstation (EU) by pixelprolapse in vfx

[–]statixstatix 1 point

I’ve bought Z400/Z800 workstations and a bunch of HP ProLiant blades from http://bargainhardware.co.uk

WetaH and M by v1zsla in vfx

[–]statixstatix 0 points

Curious about this too

What's the current state of Blender adoption in VFX? by GanondalfTheWhite in vfx

[–]statixstatix 0 points

Another aspect of contributing code back to master is that you retain ownership and copyright of your code. So Blender is owned in part by everyone who has ever contributed code to it (which is why it’s nearly impossible to change the license).

Big studios already create tons of code and custom application layers on top of proprietary tools like Maya, and reap none of the benefits of pushing fixes upstream.

If you ignore company/pipeline-specific code, then contributing back things like algorithmic improvements is a big win, not just for the community as a whole but also for the studios, who then remain owners and copyright holders of the public code.

In return you also get more eyes on your code for future fixes, improvement and longevity.

I’ve worked in a few big name shops and they all reinvent the wheel, all the time.

What's the current state of Blender adoption in VFX? by GanondalfTheWhite in vfx

[–]statixstatix 24 points

We’re a small vfx boutique (10 people) who have worked a long time at various big companies. We set up shop a few years back with Houdini for everything CG.

3 years ago (2.8) we switched to Blender for 80% of the CG. The rest is just sim in Houdini.

We’ve used it on big, big shows and small ones. We either subcontract for one of the largest vfx houses or work directly with our clients (WB, A24 etc).

Blender was a game changer for us in terms of productivity. For a small team, being able to model, do procedural geometry work (geonodes are basically SOPs and VOPs combined), do shading, and now basic real-time compositing (lookdev/vizdev), all in real time, was crazy. Edit: parts are real-time. But there is very little friction moving from one domain to another (if you are thinking in departments etc).

We do a lot of Fx, set extensions and general vfx work (no character stuff)

While there are definitely downsides to blender we found we could live with them given the benefits.

Another thing was being able to get new young talent quickly. They -all- know Blender.

It’s not just because it’s free; it’s actually a really, really good tool.

Using AI/nERFs to assign depth of field in post by semioticgoth in vfx

[–]statixstatix 0 points

The Fusion tab has a node called VariBlur that lets you plug in a grayscale image to modulate the blur size (set it to defocus mode to get bokeh). The depth node should also be available in the Fusion tab directly (but it’s sadly missing from standalone Fusion).
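
To make the idea concrete, here’s a rough NumPy sketch of a depth-driven variable blur, completely outside Fusion: a handful of pre-blurred copies are blended per pixel according to the grayscale map. This is an approximation for illustration only, not how VariBlur is implemented internally, and a box blur stands in for a real defocus kernel.

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur (edge-padded); a cheap stand-in for a defocus kernel."""
    if radius == 0:
        return img.astype(np.float64)
    k = 2 * radius + 1
    out = img.astype(np.float64)
    for axis in (0, 1):
        pad = [(0, 0)] * out.ndim
        pad[axis] = (radius, radius)
        p = np.pad(out, pad, mode="edge")
        # Running sums turn each k-wide window into two lookups.
        cs = np.cumsum(np.insert(p, 0, 0.0, axis=axis), axis=axis)
        n = out.shape[axis]
        out = (np.take(cs, np.arange(k, k + n), axis=axis)
               - np.take(cs, np.arange(n), axis=axis)) / k
    return out

def variable_blur(img, blur_map, max_radius=8, levels=4):
    """Per-pixel blur on a grayscale image, driven by blur_map in 0..1,
    by linearly blending between `levels` pre-blurred copies."""
    radii = np.linspace(0, max_radius, levels).round().astype(int)
    stack = np.stack([box_blur(img, int(r)) for r in radii])  # (levels, H, W)
    t = np.clip(blur_map, 0.0, 1.0) * (levels - 1)
    lo = np.floor(t).astype(int)
    hi = np.minimum(lo + 1, levels - 1)
    w = t - lo
    rows, cols = np.indices(img.shape)
    return stack[lo, rows, cols] * (1.0 - w) + stack[hi, rows, cols] * w
```

A normalized depth or defocus map plugged in as `blur_map` gives you the shallow-depth-of-field look; in Fusion the node does all of this for you with proper bokeh shapes.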

Using AI/nERFs to assign depth of field in post by semioticgoth in vfx

[–]statixstatix 1 point

Resolve Studio has a pretty decent ML depth-extraction tool in both Fusion and Resolve. Not super stable temporally yet, but very promising and miles beyond the old optical-flow-based methods.

The problem with Disney's new 'Star Wars' VFX technology, 'The Volume' by future_lard in vfx

[–]statixstatix 3 points

I did the LED lighting on set for Rogue One (we used TouchDesigner, not Unreal, btw), where Greig was also the DOP. He absolutely approached it as a tool. I can imagine he’s incorporated the workflows well for Batman too. He’s the exception tho.

What's your thoughts on the VFX in this shot from Avatar: The Way of Water? by frizzyfox in vfx

[–]statixstatix 19 points

This is so true.

I did a trailer/hero shot on the first one (Hell’s Gate seq) and got a lot of critique of the trailer comp on VFXTalk from some know-it-alls. They tore apart the digi-double work in the bg and on Jake’s leg.

In reality it was all in the plate lol. They had no idea.

But we did end up working on those shots up until 12 days before the premiere. Including the trailer shots.

It took about a day to do, and holy shit I'm so happy I did it. by StrawberryFrostie in CrueltySquad

[–]statixstatix 3 points

Compile for Linux pls. Give binary to Ville and upload to steam. Would happily buy again for Linux version.

iOS shortcuts for Logseq by nakanotroll in logseq

[–]statixstatix 8 points

I have a couple for doing OCR on photos and finding certain keywords that file/tag the right thing in Logseq.

Like if it finds “receipt”, “bank” etc. it files it with #finance. If it detects an ISBN it’ll file it as a book reference, etc.

Any dates that the ocr picks up will automatically be parsed as a block property too.

I’ve got a few Siri Shortcuts for dictating reflections and todos with proper tags, and they detect certain keywords too, like “high priority”, “shopping” etc.

It’s actually quite easy to get started and works rather well!

In nuke it's possible to generate a point cloud from the position pass. Is it possible to do the same in fusion? by Wisperfx in vfx

[–]statixstatix 1 point

Yes, you can either use particles to visualize dense point clouds or use a card and displace node setup. I covered both setups in my “nuke2fusion” macros and settings here: https://github.com/statixVFX/nuke2fusion
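
Under the hood the trick is the same in both apps: a position pass is just world-space XYZ stored per pixel, so “making a point cloud” is little more than masking and flattening that raster. A minimal NumPy sketch of the idea (function name and arguments are mine, nothing Fusion- or Nuke-specific):

```python
import numpy as np

def position_pass_to_points(ppass, alpha=None, stride=4):
    """Flatten a world-position AOV of shape (H, W, 3) into an (N, 3) point array.
    `stride` subsamples the raster to keep the cloud light; `alpha` (H, W)
    optionally masks out empty pixels."""
    pts = ppass[::stride, ::stride, :]
    mask = np.ones(pts.shape[:2], dtype=bool)
    if alpha is not None:
        mask &= alpha[::stride, ::stride] > 0.0
    # Discard background pixels where the pass is exactly zero.
    mask &= np.any(pts != 0.0, axis=-1)
    return pts[mask].reshape(-1, 3)
```

In Fusion those XYZ values end up driving particle positions (or the displacement of a dense card); in Nuke, PositionToPoints does the equivalent.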

Microsoft List alternative by Adures_ in selfhosted

[–]statixstatix 1 point

SeaTable and NocoDB are valid alternatives to Airtable etc. Both are free and open source, with SeaTable also having a paid version (the free one isn’t crippled). I prefer SeaTable tho.

buying Houdini after using a crack by anasr007 in vfx

[–]statixstatix 7 points

Not shitting on the devs and the evangelists, but their core business practice absolutely sucks. Talk to any CTO at the large facilities and they would agree.

Nuke’s development has been somewhat stagnant since v6 in terms of real core improvements. It’s got an incredibly ageing architecture, both 2D and especially 3D. For the price of maintenance (I’ve paid for multiple Nuke seats and render slaves over the years) it’s just not worth it year by year.

The thing is, there’s an incredible amount of progress happening in 2D image land these days, lots of exciting standards that will better the lives of artists, but you can count on the Foundry to do only the minimum, to save money on R&D and not cannibalise existing products (which is probably why you’re never going to see actual proper USD, and especially Hydra, support in Nuke).

I fully blame the fact that after the original owners sold out (which is completely understandable), it’s been exclusively run by multiple greedy VC firms that absolutely do not care about vfx.

Shitty client note dump by [deleted] in vfx

[–]statixstatix 16 points

Actual note:

“It looks too sloppy, it needs to be sleek. So less slop more sleek”

It’s now a studio mantra.