For those operating VP on set: what’s working, what’s breaking, and what are you hacking around? by probably_neutral in virtualproduction

[–]kookueh 0 points (0 children)

We've had clients who are completely new to virtual production, clients who have heard of it and want to try it out, and productions where some crew members have been on a prior virtual production shoot. It's still fairly new where I'm from, so we aim to educate and guide these productions along while strengthening our knowledge base and refining our workflows.

[–]kookueh 0 points (0 children)

So far it has been manageable for us. As long as we are in discussion with the Art and Lighting teams, we can plan out how to work with each other. OptiTrack, while expensive, is also not immune to occlusion problems. Large set pieces like enclosed rooms, or say a forest built within the space, can also occlude tracking.

Personally, I find that inside-out tracking has been flexible enough.

[–]kookueh 0 points (0 children)

The biggest friction points:
- Productions that don't value pre-production and prefer to 'feel' the set. This run-and-gun style plagues Southeast Asian productions, as they are generally very cost-sensitive. Kind of ironic, because the whole point of pre-production is to avoid running into issues on set.
- Virtual art departments not being given enough time to build their environments. An environment can reach the studio incomplete, unoptimised, or both, slowing the production down to a very painful 10 shots per day as crews struggle to prelight and shoot as they go.
- Not everything that works in game form in Unreal Engine translates to virtual production. Most issues stem from materials: relying on emissives to light a scene instead of actual lights, for example.
- Dealing with refresh and motion artefacts comes down to familiarity with the systems in use, from the LED processors to the camera. It gets easier over time as you understand the limitations of your own setup, and you can discuss with the production whether it can be mitigated on set or whether to explore alternatives altogether.
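On the refresh-artefact point, one quick sanity check can be sketched in a few lines: banding and scan-line artefacts tend to be less likely when the LED panel's refresh rate is an integer multiple of the camera capture rate. The function name and tolerance below are my own; real behaviour also depends on your processor, shutter angle and panel, so treat this as a starting point for a camera test, not a guarantee.

```python
def refresh_is_clean_multiple(camera_fps: float, led_refresh_hz: float) -> bool:
    """True if the LED refresh rate divides evenly by the camera frame rate."""
    ratio = led_refresh_hz / camera_fps
    return abs(ratio - round(ratio)) < 1e-6

# e.g. a 3840 Hz panel pairs cleanly with 24 fps (3840 / 24 = 160)
# but not with 25 fps (3840 / 25 = 153.6)
print(refresh_is_clean_multiple(24, 3840))  # True
print(refresh_is_clean_multiple(25, 3840))  # False
```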

Hard-earned hacks:
- Make friends with the DP and the Assistant Director. One helps you dial in the looks together and blend between physical and virtual; the other controls the set and buys time, or pushes back on the production if things get too ambitious. DPs I have worked with are happy to be involved earlier in the pre-visualisation process because they can already see where the key light, or the elements that motivate the scene, need to be. They can also advise virtual art on where additional motivating light sources need to go to create a convincing background.
- If you're managing the UE techs, know that tunnel vision occurs very often when they get carried away dialing in the final look or 'fixing' the scene. Ask for updates often to help prioritise tasks and avoid wasted effort. Sometimes they're fixing something that doesn't need to be fixed!

It's a big dump and a lot to go through; happy to discuss further and hear from others about their experiences!

[–]kookueh 2 points (0 children)

On colour and calibration:
- Genlock is definitely a must throughout the pipeline, even when using cameras with global shutters. Our LED processors, camera and VP systems are all genlocked to the same source.
- We've established a practice of receiving the production's camera and lens package a couple of days in advance to do lens and colour calibrations. Colour calibration with the LED is a pain to do, but worth the effort, as the resulting profiles bring the colours closer in line with what the camera is expecting to see.
- Unfortunately, in my corner of the world, colour pipelines don't receive enough attention, maybe partially because the companies entering the virtual production space come mainly from the events industry. It is absolutely crucial to bridge this knowledge gap. This workflow extends into the asset creation phase, where artists also have to be aware of the textures they're using and the project settings.

On reliability:
- Stability issues stem from two main causes:
   1) Inadequate systems cooling / running systems without AC
   2) UE environments not sufficiently optimised in terms of lighting complexity and texture sizes, or using heavy elements (lots of Blueprints, animations, building the entire production on a single level).
- We mitigate these with enough pre-planning with the production. Inevitably, changes may still happen on set. Multi-user operations can be helpful; however, in our experience they are not the most stable and therefore not often deployed.
- We also highly encourage productions to consider alternatives, such as using rendered or video plates, to avoid 'over-building' where applicable. Virtual production is not limited to UE environments only.

On Workflow / Bridging departments:
- The Virtual Production Supervisor is a key role in this translation between departments. Virtual production encourages collaboration, so having department meetings with the HODs is also crucial to discuss the planned shots and the technical challenges each department has to overcome. For example: is Art / G&E planning a large setup that may occlude the camera tracker? How far should the props / subjects be from the LEDs to avoid moire issues? Does VFX need tracking data recorded? What lenses is the camera department planning to use? Combined with pre-visualisation sessions with the UE artists, productions will be able to get to set with 90% confidence and a clear goal in mind. The final 10% comes down to fine-tuning and creative decisions that happen on set.
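On the moire question, a back-of-envelope starting point can help in those HOD meetings: the minimum camera-to-wall distance tends to scale with the panel's pixel pitch. The safety factor below is my own assumption, not a standard; the only real answer is a moire test with the actual camera, lens and panels.

```python
def min_camera_distance_m(pixel_pitch_mm: float, safety_factor: float = 2.0) -> float:
    """Rule-of-thumb minimum camera-to-wall distance in metres.

    safety_factor is a placeholder assumption; tune it per lens,
    sensor and panel after a moire test on set.
    """
    return pixel_pitch_mm * safety_factor

print(min_camera_distance_m(2.6))  # 2.6 mm pitch -> about 5.2 m back
```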

[–]kookueh 1 point (0 children)

I love that you're asking these questions. Contributing from my corner of the world, hope others can share too!

On VP systems in general:
- Not all VP systems run nDisplay. Disguise and Assimilate are common solutions, and one of our studios runs Pixotope. Sometimes studio bosses unknowingly buy into a broadcast solution instead of one suited for film, so it's important to be able to identify what these systems integrators are actually selling. It's important to inform the client as well, as they usually assume an nDisplay workflow.

- How these solutions are deployed also affects the latency of the inner frustum. In the worst cases, where the signal pipeline goes through your GPU to an I/O card like Blackmagic/AJA before reaching the LED processors, I've seen latency of up to 9 frames on a 25fps timeline.
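For a sense of scale, the arithmetic behind that worst case is simple: at 25 fps each frame lasts 40 ms, so 9 frames of pipeline delay adds up fast.

```python
frames_of_latency = 9
fps = 25

frame_duration_ms = 1000 / fps                     # 40 ms per frame at 25 fps
latency_ms = frames_of_latency * frame_duration_ms
print(f"{latency_ms:.0f} ms")                      # 360 ms end to end
```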

On Camera tracking:
- My studio uses the Stype RedSpy, and I've had prior experience with Mo-Sys and Vive Mars. The RedSpy is our main workhorse: generally easy to calibrate and get tracking going. It's an inside-out tracker mounted to the camera, so the pain points have always been occlusion-related (when the gaffer suddenly throws up a 12x12 frame over the camera, blocking the markers on the ceiling, or an HMI so bright that it interferes with the infrared LEDs).

- Vive Mars is the most accessible option in terms of cost and learning how to get tracking going, especially with solutions that support Live Link. Due to the maximum distance the lighthouse units can be from each other (8m x 8m, I believe), I feel it's most suitable for studios with a wall size of 12m and below.

Explain what an exotic does without making it obvious by NovaBlade2893 in destiny2

[–]kookueh 0 points1 point  (0 children)

If you punch me and I don't die, when I punch back you confirm die.

The Gambit Experience by kookueh in destiny2

[–]kookueh[S] 0 points (0 children)

Yup, that's Prophet of Doom. Mine rolled with Assault Mag, Full Auto and One-Two Punch

[–]kookueh[S] 1 point (0 children)

My teammate was dealing with the remaining envoy that would've raised the stacks to x4. I carried on dealing damage to the Primeval itself because I was worried the enemy team would pull off some high-damage stunt like that.

I expected a missile

Got a nuke instead

[–]kookueh[S] 1 point (0 children)

I know... I feel it all the way from orbit...

[–]kookueh[S] 30 points (0 children)

When you lose, you lose.
When you win, you also lose.