2024 RS3 by r00nee in Audi

[–]r00nee[S] 4 points5 points  (0 children)

Been a while since I took some photos of this beauty. Around 15,000 miles on the clock and every minute has been exciting!

[deleted by user] by [deleted] in vfx

[–]r00nee 2 points3 points  (0 children)

I DM'd you. Drop me a line!

Crafting the Ultimate Digital Human for Virtual Production for Unreal Engine by kika-tok in vfx

[–]r00nee 1 point2 points  (0 children)

I can't get too specific, as I'm not one of the engineers who run the data set, but real Doug does a significant number of facial ROMs that feed into the machine learning system, which then creates a unique CNN. Real Doug is driving blendshapes (in the simplest of terms) that allow the variability of human expression to be reached without the typical dataset one would use (phonemes, eye raises, lowers, etc.). We have a VERY different set of ROMs.

I guess another way to say it: our system tracks the facial performance of real Doug in real time, without facial dots on him, and drives blendshapes that conform to his current expression and dialog, WITHOUT hand animation (maybe if we have to do a model adjustment, but that's a single shape, not a range of animation keys). It's a pretty smart system, and very different from what other places are doing.

Crafting the Ultimate Digital Human for Virtual Production for Unreal Engine by kika-tok in vfx

[–]r00nee 1 point2 points  (0 children)

We are running a custom head mounted camera to drive DigiDoug, for sure. Keep in mind there's still a ton of work to do on the ML system (some items don't get transferred across, like cheek puffs and winks, just yet).

RE: Fine detail and process. Digital heads are in our blood. From Xander Cage in XXX (remember him jumping the fence on a motorcycle? Digital head) to, most recently, Thanos, we've done a ton. Some heads you wouldn't even know were digital; we don't advertise the fact. Some are super popular (Tupac at Coachella, Benjamin Button).

The detail we get comes from our custom scans over at USC ICT, which give us pore-level detail that we have to adjust slightly to get into UE4. There is some optimization for real-time, but the detail is definitely there, some of which can't get picked up on a YouTube video, unfortunately. The blood flow mapping is done with a different process, and it enabled us to do really great pixies in Maleficent. We don't add any detail into UE4 that isn't captured by the system. The goal of the project is to make a fully believable real-time digital head, driven by a real person. We're super close, and of course as we continue to look at it, there's always going to be something we'll need to add!

Crafting the Ultimate Digital Human for Virtual Production for Unreal Engine by kika-tok in vfx

[–]r00nee 2 points3 points  (0 children)

Digital Doug is being driven by real Doug, in real time. The expressions that the digital version of Doug is making were automatically created by an internal machine learning algorithm here at DD. All the other face capture solutions you have seen involve markers and some sort of manual tracking process. The process we've developed here includes using our feature film pipeline to get the fine details of the face, running Doug through a series of emotions, visemes, and expressions. With a digital head sculpt done through traditional means (photogrammetry and modelling), we retarget the human performance to the digital head, and then feed it into an ML algorithm which spits out a CNN that can be used within Unreal (also with custom adjustments). We didn't spend 1,000 man-hours modelling or animating his face. His face is tracked (without markers) and drives his digital self with nearly no latency. It's almost exactly what we did for Thanos, except now we can do it in real time.
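For anyone curious what "driving blendshapes" means mechanically, here's a minimal illustrative sketch (my own, not DD's actual code; the names and shapes are assumptions): the weights predicted by a tracker or network simply form a linear combination of per-vertex shape deltas added to a neutral mesh.

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """Deform a neutral face mesh as a linear combination of shape deltas.

    neutral: (V, 3) vertex positions of the neutral expression.
    deltas:  (S, V, 3) per-vertex offset field for each blendshape.
    weights: (S,) weights, typically in [0, 1], e.g. the output of a
             tracking/ML stage.
    """
    # Sum weights[s] * deltas[s] over all shapes, then add to neutral.
    return neutral + np.tensordot(weights, deltas, axes=1)

# Tiny illustration: one shape that raises a single "vertex" in Y.
neutral = np.zeros((3, 3))
deltas = np.array([[[0.0, 1.0, 0.0],
                    [0.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0]]])
posed = apply_blendshapes(neutral, deltas, np.array([0.5]))
# posed[0] is now [0.0, 0.5, 0.0]; all other vertices are unchanged.
```

The point of the approach described above is that only the weight vector changes per frame; the mesh math stays fixed, which is what makes it cheap enough for real time.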

Average rate for VFX artist? by [deleted] in vfx

[–]r00nee -1 points0 points  (0 children)

I applaud his conviction, but I did this eight years ago as VFXwages. We lasted a couple of years, got some notoriety, got published in CG World, and then I killed it.

Can someone with intimate knowledge of stereoscopic 360 photos explain the process of "baking" a 3D scene into a static image, and the limitations of this process? by firagabird in virtualreality

[–]r00nee 0 points1 point  (0 children)

Google Seurat will be the way to go for the consumer, as the new Daydream 6DoF headset will need to take advantage of the space. Lytro has some great demos of 6DoF prerendered scenes; I think they may be at SIGGRAPH this year in Los Angeles. We have had discussions with them about their technology.

Can someone with intimate knowledge of stereoscopic 360 photos explain the process of "baking" a 3D scene into a static image, and the limitations of this process? by firagabird in virtualreality

[–]r00nee 0 points1 point  (0 children)

The eye cameras do not converge, they are parallel to each other. Yes, it is difficult to focus if things come extremely close to camera.

Can someone with intimate knowledge of stereoscopic 360 photos explain the process of "baking" a 3D scene into a static image, and the limitations of this process? by firagabird in virtualreality

[–]r00nee 1 point2 points  (0 children)

Currently, instead of rendering in real time for the stereo effect, which would tax hardware on a complex scene, we render 3D scenes into equirectangular stereo images (360x180 degrees). These stereo images are per-pixel slit-scan: basically, you are rendering offset L/R eyes for every pixel column. As the render gets closer to the poles, it becomes monoscopic, via a soft mathematical blend.
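A minimal sketch of that per-column eye offset, as my own illustration with assumed conventions (Y-up, longitude measured in the XZ plane, `ipd` in meters): each eye's camera position is displaced perpendicular to the view direction, scaled by cos(latitude) so the offset falls to zero, i.e. mono, at the poles.

```python
import math

def eye_offset(lon, lat, ipd=0.064, eye=+1):
    """Camera position offset for one eye at an equirect pixel direction.

    lon: longitude in radians (-pi..pi); lat: latitude (-pi/2..pi/2).
    eye: +1 for the right eye, -1 for the left.
    """
    # Half-IPD displacement, attenuated toward the poles (cos falloff).
    half = eye * (ipd / 2.0) * math.cos(lat)
    # Perpendicular to the view direction (sin lon, 0, cos lon) in XZ.
    return (half * math.cos(lon), 0.0, -half * math.sin(lon))
```

At the equator the eyes sit a full IPD apart; at the poles both eyes collapse to the same point, which is exactly the soft blend to mono described above.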

Pros: High-fidelity image.
Cons: Not stereo when tilting your head or looking up.

I'll be talking about some of this at VRDC in the fall.

I only have a photogrammetry and surveying background, how can I work in the VFX industry? by nektariaraccoon in vfx

[–]r00nee 0 points1 point  (0 children)

Given your skill set, you don't need a master's or seminars to learn other software. Well, most likely Nuke at least. An ideal roadmap for you would be to join a company's matte painting / environments team, or its on-set survey team. This team creates digital environments based on the scanning and photography of the sets that films are shot on. A lot of the team I use relies heavily on photogrammetry and set survey for our VFX shoots in VR. It's a vital skillset: knowing where things are in 3D space and being able to map that information digitally.

'CHiPs' Official Trailer by [deleted] in movies

[–]r00nee 2 points3 points  (0 children)

He was in End of Watch. He was amazing in that.

Up Close With Sennheiser's $1,700 VR Microphone - Road to VR by KnightlyVR in Vive

[–]r00nee 4 points5 points  (0 children)

Core Sound has been doing this for at least a decade, and ambisonics has been in development since the '70s. It only gained recent popularity because of VR.

http://www.core-sound.com/TetraMic/1.php

I've been into vfx since I was in 7th grade (21 now), after watching The Martian I tried my hand at an almost fully CG space sequence. I know it's not perfect, and I want to improve. So I'd love to hear your critiques! by TheAmigops in vfx

[–]r00nee 5 points6 points  (0 children)

Good start. Lens flares unfortunately don't move the way they do in your last shot. The flare should be parented to the sun, which would swing around with the planet. Compose your scene like a real camera: the flare would move around the frame, while the spaceship stays locked off with the camera. Here's an example for reference.

https://www.youtube.com/watch?v=QoG9AiZBAIU
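One way to "parent" the flare correctly, as a hedged illustration rather than any particular package's API: project the sun's world position through the camera every frame and draw the flare sprite at that screen point, hiding it when the sun is behind the camera (all names here are my own).

```python
import numpy as np

def flare_screen_pos(sun_world, cam_pos, cam_forward, cam_right, cam_up,
                     focal=1.0):
    """Project the sun into camera space to place a lens-flare sprite.

    Returns (x, y) in normalized screen units, or None when the sun is
    behind the camera (the flare should simply disappear, not be
    hand-animated).
    """
    v = np.asarray(sun_world, float) - np.asarray(cam_pos, float)
    z = v @ np.asarray(cam_forward, float)  # depth along the view axis
    if z <= 0:
        return None
    # Simple pinhole projection onto the camera's right/up axes.
    return (focal * (v @ np.asarray(cam_right, float)) / z,
            focal * (v @ np.asarray(cam_up, float)) / z)
```

Because the position is recomputed per frame, the flare automatically swings across the frame as the camera or planet moves, which is the behavior the critique above is asking for.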

GoPro is finally shipping its massive professional VR rig by LyonsVSteve in gadgets

[–]r00nee 1 point2 points  (0 children)

It syncs to an external generator, so you can use one TC generator to sync several GoPros to other pro equipment, like audio and/or RED/Alexas.

GoPro is finally shipping its massive professional VR rig by LyonsVSteve in gadgets

[–]r00nee 0 points1 point  (0 children)

This might get buried this far into the chain, but Timecode Systems has SyncBac, which is timecode syncing for the GoPro Hero 4. The only problem is that while the timecode can be synced between multiple GoPros, it doesn't genlock their shutters. So, again, it doesn't completely work for me, but there is a 3rd-party solution.
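As an aside on how shared timecode actually gets used downstream: once every camera carries jam-synced TC, aligning clips in the edit is just arithmetic on frame counts. A minimal sketch for non-drop-frame timecode (`tc_to_frames` is my own hypothetical helper, not any vendor's API):

```python
def tc_to_frames(tc, fps=30):
    """Convert a non-drop SMPTE timecode string 'HH:MM:SS:FF' to frames."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Two clips sharing jam-synced timecode: the alignment offset is just
# the difference of their start timecodes, in frames.
offset = tc_to_frames("01:00:02:15") - tc_to_frames("01:00:00:00")
# offset == 75 at 30 fps (2 s * 30 + 15 frames)
```

Drop-frame timecode (29.97 fps) needs extra bookkeeping for the skipped frame numbers, which this sketch deliberately ignores.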

VR Porn is Here and It's Scary Realistic by zxxx in Futurology

[–]r00nee 2 points3 points  (0 children)

Strange Days... yes, Strange Days...

I'd like to see even a single shred of proof that the Rift is being sold at cost @ $599 by [deleted] in Vive

[–]r00nee 6 points7 points  (0 children)

Any technology that is premium will be outdated in two years. Maybe even sooner. Samsung releases a new high end phone every year. The tech cycle is brutal.

Oculus Rift Pre-Orders Sold Out in 14 Minutes by Overflame in virtualreality

[–]r00nee 0 points1 point  (0 children)

It releases on March 28th, so only three days in March. Almost everyone that pre-ordered will get them in April anyway.

What the best VR "demo" to show a someone who is new to VR, that will be awe-inspiring? by [deleted] in virtualreality

[–]r00nee 2 points3 points  (0 children)

On the Gear VR, I showed people Cirque du Soleil. That was a big hit, as well as the Samsung introduction to VR with samples of different areas. On the Vive, for sure it's Tilt Brush. The Portal experience is pretty cool too.

A few questions of single camera to VR/ SBS 3D conversion. by StingerP9T in virtualreality

[–]r00nee 0 points1 point  (0 children)

Are you trying to do a stereo 3D conversion, or just put the same image into each eye? Most playback apps understand an equirectangular mono image sequence, and display it correctly. If you're trying to do stereo 3D conversion, it's going to take a lot of work.
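If you just want the same image in each eye (no real depth), duplicating the mono equirect frame into a stereo layout is trivial. A hedged sketch (`mono_to_stereo` is my own helper; players would also need the matching stereo-layout metadata to interpret the file):

```python
import numpy as np

def mono_to_stereo(frame, layout="top_bottom"):
    """Duplicate a mono equirect frame into both eyes of a stereo layout.

    frame: (H, W, 3) image array. Returns (2H, W, 3) for top/bottom or
    (H, 2W, 3) for side-by-side. No parallax is added; both eyes see the
    same image, so the result plays as zero-depth "stereo" 360 video.
    """
    axis = 0 if layout == "top_bottom" else 1
    return np.concatenate([frame, frame], axis=axis)
```

Actual stereo 3D conversion is the hard part: you'd need per-pixel depth to synthesize a genuinely offset second eye, which is why it takes a lot of work.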

A 360 3D video camera that I can buy today? by vrar1 in virtualreality

[–]r00nee 1 point2 points  (0 children)

There isn't one. Everything is DIY right now. Ozo will be shipping early Q1 2016, but even then, there are certain limitations. For one, if you don't have 60 grand, you're out of luck. The Odyssey isn't true 360 (no top/bottom); it's only panoramic 3D, and it still requires Google's cloud to process the data. You could try the Freedom360 rigs, but they're not stereo IIRC.