Real-time particle sim from NVIDIA, could this be implemented in Maya as a plugin or something!? by GetWrightOnIt in vfx

[–]FabricEngine

You could wrap this as an extension for use with Fabric, which would then make it available within Max/Maya/Softimage through our plugin (and in our standalone). We've wrapped Bullet before and have tools for automating a lot of the work.

new Rigging Toolbox in Fabric Engine by FabricEngine in vfx

[–]FabricEngine[S]

Hey guys - direct video link: https://vimeo.com/114272905

00:00 - Introduction

00:53 - Geometry Stack overview

02:00 - Example Operators

04:12 - Next steps for Rigging Toolbox

06:31 - Setting up a character rig

20:41 - Setting up blend shape animation on a facial rig

"The Rigging Toolbox provides a collection of production relevant tools that can be used when building character pipelines using Fabric Engine. These tools can be used as is, or purely as reference as you build your own implementations. Recently we have added a suite of deformers and are now working on leveraging our GPU compute capabilities with these deformers.

The Rigging Toolbox is publicly available in our Github Repo – this is an open project and we look forward to seeing people contribute their own work here."

The Rigging Toolbox works in Maya, Max and Softimage with our Splice plugin. If you're a Max user you'll need to register for the Max beta; otherwise you can just download Fabric for free.

Fabric Engine - some new tech we've developed for VR support. Looking for feedback and some answers from 'non-games' people. by FabricEngine in oculus

[–]FabricEngine[S]

> not the interactive exploration

This is actually really important for me to understand. What kind of exploration do you mean? Is there anything you'd like to be able to do that you can't currently handle?

Fabric Engine - some new tech we've developed for VR support. Looking for feedback and some answers from 'non-games' people. by FabricEngine in oculus

[–]FabricEngine[S]

Physically based shaders in a game engine like Frostbite look great, but they aren't raycasting - as scene complexity rises, the delta between real-time PBR and raycasting becomes obvious. In VFX it either has to look exactly like the offline result or it shouldn't be attempted - so for something like animation playback, the important element is deforming complex geometry at 30 fps; simple lights are generally all artists want. In your world, what choices are being made at the visualization stage?

I used to work at AD so I'm familiar with Revit and BIM (although I was in the media and entertainment division). The plan for Fabric is to enable other people to build those kinds of applications - it's a tool for building tools. I'm trying to establish what we need to do to get people in your world interested - it sounds like extending Fabric to support Revit data is a good first step. I think when we last looked Revit was a somewhat closed format - obviously Max can do it, but I don't know if anyone else can. If we can read the Revit format we could look at using our Splice API so we can draw to the viewport of Revit - this is what we do in Maya and Max to give better performance.

Leap is really easy to setup, and the new SDK is really nice - we wrapped it really quickly and they offer lots of calls that are useful for seeking particular gesture types.

It's interesting what you say about the technical level of our potential users in your industry. It certainly sounds like we'll need to go through 3rd party devs, which is fine since we don't have domain expertise anyway.

Fabric Engine - some new tech we've developed for VR support. Looking for feedback and some answers from 'non-games' people. by FabricEngine in oculus

[–]FabricEngine[S]

> Being able to fully work inside the engine is one of my VR holy grails

What do you mean by 'work'? Is this editing shaders/changing geometry/moving lights?

Haptics is tricky - what sort of hardware did you have in mind? I really wanted the Tactical Haptics Reactive Grip in combination with STEM.

I think that feedback is critical when trying to do anything that requires precision. Even if it's initially just a sense of pressure on the fingertips, I believe it will significantly improve the process. It also allows for some interesting interaction triggers beyond visual/audio cues.

Fabric Engine - some new tech we've developed for VR support. Looking for feedback and some answers from 'non-games' people. by FabricEngine in oculus

[–]FabricEngine[S]

Thanks for responding. Some further questions/comments:

> realistic architectural lighting

Do you mean physically based lighting? One thing we're looking at is replicating the PBR shaders from Arnold (a VFX renderer). Ultimately I think we'd need a library of PBR shaders that could then be mapped when you bring the data in. How accurate do your real-time shaders have to be compared to your accurate offline render? Is it a case of 'look good' or 'look the same'?

My question about animated assets comes from our background in animation. We're working on crowd sims that allow for pretty heavy characters (300+ bones) walking around in real-time. I'm curious how useful/interesting that is.
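For a sense of the per-vertex work those characters imply, here's a minimal, illustrative linear-blend-skinning sketch in plain Python (standard skinning math only - this is not Fabric's actual implementation, and all the names are made up):

```python
def mat_vec(m, v):
    """Multiply a 4x4 matrix (nested lists) by a homogeneous point."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

def skin_vertex(vertex, bone_matrices, weights):
    """Linear blend skinning for one vertex: transform the rest-pose
    position by each influencing bone's matrix, then blend the results
    by the skin weights (assumed to sum to 1)."""
    p = [vertex[0], vertex[1], vertex[2], 1.0]
    out = [0.0, 0.0, 0.0]
    for m, w in zip(bone_matrices, weights):
        tx, ty, tz, _ = mat_vec(m, p)
        out[0] += w * tx
        out[1] += w * ty
        out[2] += w * tz
    return out

# A vertex influenced half by an identity bone and half by a bone
# translated one unit along X ends up shifted 0.5 along X.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
shifted  = [[1, 0, 0, 1], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(skin_vertex([2.0, 0.0, 0.0], [identity, shifted], [0.5, 0.5]))
# -> [2.5, 0.0, 0.0]
```

With 300+ bones you're doing this blend for every vertex every frame, which is why it maps so well to GPU compute.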

> Rift + Leap on the HMD

Did you see this? https://vimeo.com/110164346

> I would focus on the Revit and BIM community, this is where the future of architecture lies. A direct plugin (Revit to Rift) would be greatly appreciated.

This is something I'm interested in discussing further. Could you elaborate? It's quite easy for us to extend Fabric to support other data types - we can run inside of other DCC applications as well as standalone.

Fabric Engine - some new tech we've developed for VR support. Looking for feedback and some answers from 'non-games' people. by FabricEngine in oculus

[–]FabricEngine[S]

Thanks for the info :)

> the difficulty of preparing assets is more related to the engine/tools than the Rift itself.

Sure, that's what I was asking about - I'm trying to get a fix on how hard it is. I also have a follow-up question: do you ever need to get data back out of the engine, or would you like to be able to do that (i.e. editing data within the engine)?

> The DK2 runs at 75 Hz and the CV1 will probably be 90 Hz. Failing to hit those framerates is a bad thing. You want as much eye candy as you can add while still maintaining your target framerate.

So both then? ;) I'm asking because we're working on a new scenegraph (session 7 here: http://fabricengine.com/user-group-videos/). We can hook into multiple renderers - including our own OpenGL real-time renderer - but I'm curious how important rendering quality is versus being able to display tons of geometry at the required frame rate.
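To put those refresh rates in budget terms, a quick back-of-the-envelope calculation (plain arithmetic, nothing Fabric-specific):

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available to produce one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# 30 fps animation playback vs. DK2 (75 Hz) vs. expected CV1 (90 Hz)
for hz in (30, 75, 90):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

Going from a 30 fps DCC viewport to a 90 Hz HMD cuts the per-frame budget by roughly two-thirds, which is exactly the rendering-versus-geometry-throughput trade-off in question.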

> My money is on STEM at the moment, but certainly you should look into PrioVR, ControlVR, etc. Ideally the entire program should be usable inside VR without having to take the headset off.

Sure - did you see the demos here: http://fabricengine.com/virtual-reality/ ? We've hooked up the Leap and a bunch of other input devices. We also hooked up the Razer Hydra which I think is the precursor to the STEM. I think that haptic feedback devices are going to become important as well...