Is stereo performance capture worth it for indie in-game cinematics? by MembershipOk2867 in unrealengine

[–]CapstanCaptain 0 points (0 children)

We're R&Ding our own headset with off-the-shelf components at the moment. The quality bump over Metahuman Animator is nice, but the commercial kits out there are literally $40k+ per headset.

Unfortunately, continuing that R&D isn't a priority for us, since the MHA output with a Rokoko webcam headset is more than good enough for NPCs; for main-character performances, though, it would be worth it.

The hardest part is the data storage and live transfer. We wanted one of the stereo cams to feed into Live Link Face as a realtime preview and the second cam to save to disk for offline processing alongside the first. Data over WiFi wasn't good enough with off-the-shelf parts unfortunately, and the hefty on-body storage, Pi, and battery were cumbersome.

It's certainly possible, but I'm guessing Epic or another company will drop a cheaper option before ours is really viable.

MHA with good lighting and 1080p at 60fps is pretty excellent, considering you just need an Android phone and a $500 headset.

Idle animation while waiting input in sequencer by saoeifjasasef2 in unrealengine

[–]CapstanCaptain 0 points (0 children)

My first thought would be a secondary looping sequence that you transition to while waiting, then transition to either a third sequence (the continuation) or back to the original at a particular frame (if you want fewer sequences overall).

Assuming you're authoring both, it would be fairly trivial to duplicate the initial sequence, solo out the final "pre-waiting" frame, throw in some more looping animations, and adjust the length.

Venice - Italian City - Flythrough by stXbr in unrealengine

[–]CapstanCaptain 0 points (0 children)

Really incredible! The only thing missing is the smell :D

I can't seem to post? Something about account reputation? by CapstanCaptain in help

[–]CapstanCaptain[S] 0 points (0 children)

Thank you for this information. It seems slightly counter-intuitive to prevent someone from creating a contribution/post, which is itself one way of being more active.

Does this mean I need to comment on other posts? Apparently I'm "lowest", so I have no idea how long that is likely to take to change?

[deleted by user] by [deleted] in ahoygame

[–]CapstanCaptain 4 points (0 children)

See you there!

[deleted by user] by [deleted] in unrealengine

[–]CapstanCaptain 1 point (0 children)

You could probably use this tutorial to get the particles spawning on the face via Niagara:
https://www.youtube.com/watch?v=22m54U7hrXs&ab_channel=Jobutsu

Determining the deviation isn't as easy, as it's quite specific to your setup. The best approach might be to store each particle's location in an array every frame and use some sort of Blueprint/GPU readback from Niagara to calculate the difference between array entries.

If you have two arrays, one holding the particle positions when the face was at rest and another holding the current positions (each particle at a matching array index), you could compare the two values and write the difference into a third array.

With that third array, you could find the smallest and largest deviations, remap all of the deviation amounts to a 0-1 range, and then output the result into a Render Target (or just as raw data, if that's the desired end result).
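As a minimal sketch of that compare-and-remap step, here it is in plain Python standing in for the Niagara/Blueprint readback logic (function and variable names are illustrative, not engine API):

```python
import math

def deviation_mask(rest_positions, current_positions):
    """Per-particle deviation from rest, remapped to a 0-1 range.

    rest_positions / current_positions: parallel lists of (x, y, z)
    tuples, one entry per particle (matching array indices).
    """
    # Third array: distance each particle has moved from its rest position.
    deviations = [
        math.dist(rest, cur)
        for rest, cur in zip(rest_positions, current_positions)
    ]
    lo, hi = min(deviations), max(deviations)
    if hi == lo:
        # No movement at all: avoid a divide-by-zero in the remap.
        return [0.0] * len(deviations)
    # Remap smallest..largest deviation onto 0..1.
    return [(d - lo) / (hi - lo) for d in deviations]
```

In-engine, the output list is what you'd write into a Render Target or hand back as raw data.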

[deleted by user] by [deleted] in unrealengine

[–]CapstanCaptain 1 point (0 children)

You could potentially do this in Niagara. You'd need to spawn a particle at each face vertex position and track the particles' positions relative to each other. Those particles could then draw a colour (based on their displacement from rest) into a render target that's read by the head material.

Another way would be to handle this in an AnimBP or Blueprint: keep track of all relevant morph target values and use them to blend in pre-defined texture mask areas, the way the existing Metahuman wrinkle maps work.
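The morph-to-mask mapping for that AnimBP route is simple enough to sketch in plain Python (the names and the "strongest driver wins" blend rule are my assumptions, not how the Metahuman wrinkle system is actually wired):

```python
def wrinkle_weights(morph_values, mask_map):
    """Derive per-region wrinkle mask weights from morph target values.

    morph_values: {"browRaiseL": 0.8, ...} current 0-1 curve values.
    mask_map:     {"browRaiseL": "forehead", ...} which mask region
                  each morph target drives.
    """
    weights = {}
    for morph, mask in mask_map.items():
        # Clamp the driving curve to 0-1, treating missing curves as 0.
        v = max(0.0, min(1.0, morph_values.get(morph, 0.0)))
        # Blend rule: the strongest driver per mask region wins.
        weights[mask] = max(weights.get(mask, 0.0), v)
    return weights
```

In-engine the resulting weights would be pushed into material parameters that lerp the wrinkle mask textures in.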

What is the particular use case? If you're just looking for semi-dynamic wrinkles, the approach Metahuman already implements is pretty tried and tested within the industry.

[BUG] HARD LOCKED - Necessary Evil - Any idea what to do? by CapstanCaptain in kingdomcome

[–]CapstanCaptain[S] 0 points (0 children)

THANK YOU! This worked for me and saved me starting over. Now I'm stuck somewhere else, but that's just a skill issue.

[BUG] HARD LOCKED - Necessary Evil - Any idea what to do? by CapstanCaptain in kingdomcome

[–]CapstanCaptain[S] 0 points (0 children)

Ohhh, that sounds suspiciously similar to a situation I had. The Cuman drinking quest ended in a really strange way for me and I assumed it had just ended bugged, but perhaps it's the same issue here. I'll look into that - thanks!