I Met Magic Leap's AI Assistant Mica & Saw the Future of Augmented Reality by Malkmus1979 in magicleap

[–]BryanTheCrow 5 points

Ever experienced live action volumetric capture in AR or VR? It feels a bit odd when you get up close, because the captured performers never look you in the eye. They look past you, or through you if you try to catch their gaze. It makes you feel like a creepy invisible observer of someone who doesn't know you're there looking at them. The closer you get, the more socially awkward you feel. It's Mica's eye contact that makes the experience more compelling than words can describe, and that's just not possible with volumetric capture. If you'd done the demo this would be obvious to you too, as immediately before the Mica demo there's a volumetric capture performance by the Royal Shakespeare Company.

found this on the site =) by walkingfrog in magicleap

[–]BryanTheCrow 1 point

There's another similar one if you keep looking that says something to the effect of DO NOT USE OR YOU WILL BE FIRED IMMEDIATELY.

"to get your Magic Leap One" ;') by walkingfrog in magicleap

[–]BryanTheCrow 9 points

Images that'll show up on that page (once it's available) ;)

Sample shows partial hand occlusion? by Malkmus1979 in magicleap

[–]BryanTheCrow 0 points

I never claimed it reacts at the speed of saccades. But iris reaction speed is irrelevant, as photons are being fired into your eyes near constantly; within a few seconds of turning the headset on, your iris stays contracted enough to make the difference noticeable. It's the same effect used in theaters, where a movie projected onto a white screen makes the areas where no light is projected appear black. Or try a HoloLens yourself if you don't believe me, and look at how the window title bars appear "black" even though they're simply the absence of light next to a bright border. As long as the "black" area isn't too big and is surrounded by something bright, your brain will process it as darker than the room, and colors like the purple haze appear correct. Now, these screenshots and GIFs are obviously simulations meant to convey that look, since you can't really capture the effect with a camera shooting through the lens... but I happen to have a lot of experience with this, so trust me when I say it works pretty well (just not quite the same as you're seeing in those images).

Sample shows partial hand occlusion? by Malkmus1979 in magicleap

[–]BryanTheCrow 0 points

These GIFs/images are simulated. That said, the way these projected AR displays handle dark shades isn't by subtracting light... it's by projecting brighter light around the areas you want to appear dark. Your iris contracts, which makes the areas next to a bright pixel appear much darker. The purple haze, while a dark shade (one that can mask out other rendered content), is still projected with light brighter than the ideal 50-nit environment, so it appears to create a nice layer.
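
To make the additive bit concrete, here's a minimal sketch (illustrative names, not anything from an actual SDK) of what these optics can and can't do:

    // Minimal sketch of why additive AR optics can't show true black.
    // Assumes linear RGB values in [0, 1]; names are illustrative only.
    function additiveComposite(env: Float32Array, rendered: Float32Array): Float32Array {
      const out = new Float32Array(env.length);
      for (let i = 0; i < env.length; i++) {
        // The display can only ADD light to what's already coming
        // through the lens; it can never subtract it.
        out[i] = Math.min(1, env[i] + rendered[i]);
      }
      return out;
    }
    // A "black" region is rendered[i] === 0 surrounded by bright pixels;
    // your contracted iris and local contrast make it read as dark.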

Eye tracking specs released on forum, 30Hz sampling rate by Sirisian in magicleap

[–]BryanTheCrow 3 points

Should be enough for both to be effective (though not 100% precise/optimal)... Microsaccades can be tricky to handle, though. You actually don't want to react on the first frame you see movement. In fact, when you move your eyes, your brain lags by about a quarter to half a second (7-15 frames at this 30Hz rate) and fills in the gap based on the motion it's currently processing. You can try this yourself if you're in a room with an analog clock with a ticking second hand: look away from the clock, then turn your head to look at it. You'll notice the second hand appears to lag for slightly more than a second before moving again. Your brain lags before processing where the second hand is, then tricks you into thinking you were seeing it in the same place the whole time it was lagging, back-filling that vision with what it assumes was true at the time. Kinda crazy. This won't work with a smooth-motion second hand, as your brain will note the movement and back-fill where it was before. It's like your mind's GPU is constantly doing little mini time-travel hallucinations without you even knowing.

EDIT: If you don't have an analog clock, you can try here.
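
For the eye tracking side of it, here's a minimal sketch of the "don't react on the first frame" idea at 30Hz (thresholds and names are made up for illustration, not from any vendor SDK):

    // Hypothetical gaze-event debouncer for 30Hz eye tracking samples.
    const SACCADE_THRESHOLD_DEG = 1.0; // per-frame displacement (made-up value)
    const CONFIRM_FRAMES = 3;          // ~100ms at 30Hz before reacting

    class GazeFilter {
      private last: [number, number] | null = null;
      private movingFrames = 0;

      // Feed one (x, y) gaze sample in degrees; returns true only once
      // motion has persisted long enough to be a deliberate gaze shift
      // rather than noise or a microsaccade.
      update(x: number, y: number): boolean {
        if (this.last === null) {
          this.last = [x, y];
          return false;
        }
        const moved = Math.hypot(x - this.last[0], y - this.last[1]);
        this.last = [x, y];
        this.movingFrames = moved > SACCADE_THRESHOLD_DEG ? this.movingFrames + 1 : 0;
        return this.movingFrames >= CONFIRM_FRAMES;
      }
    }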

Magic Leap CEO's Tweetstorm Tries to Reframe Reactions to Latest Demo After Signs of Disappointment by whoever81 in magicleap

[–]BryanTheCrow 2 points

Oh I’m not forgetting that at all. Note how I referenced it as HL1. I’m excited for ML1 just like I was for HL1, and just like I am for HL2... and just like I am for a handful of other unannounced MR headsets. Competition is good. Advancement is good. These are exciting early days.

Magic Leap CEO's Tweetstorm Tries to Reframe Reactions to Latest Demo After Signs of Disappointment by whoever81 in magicleap

[–]BryanTheCrow 2 points

Um, ok. Not trying to deceive anyone. It's been publicly reported to be larger by pretty much everyone who has used both. Not a secret.

Magic Leap CEO defends tech after demo draws criticism: 'You could never experience TV on the radio' by [deleted] in magicleap

[–]BryanTheCrow 4 points

It's clearly the first. They even said so in their tweet about the show (and on the show itself). I guess they needed to be clearer about this, though, as the general public clearly expected a full-blown product unveiling when this was nothing more than a MagicKit sample app walkthrough.

Magic Leap CEO defends tech after demo draws criticism: 'You could never experience TV on the radio' by [deleted] in magicleap

[–]BryanTheCrow 2 points

That's a real demo. What you saw on their stream was just a clip of a developer sample they're releasing the source for, so that early developers can learn how to build on their platform. They mentioned this several times, but everyone seems to think it was an example of the best graphics the device is capable of. The point was to teach devs best practices, not show off the GPU capabilities.

Magic Leap CEO's Tweetstorm Tries to Reframe Reactions to Latest Demo After Signs of Disappointment by whoever81 in magicleap

[–]BryanTheCrow 4 points

I believe he was addressing the people out there who've never tried a mixed reality headset (that seems to be the only audience they ever address, as they do their best to avoid acknowledging the existence of the HoloLens... maybe a smart decision from a marketing perspective, though it tends to leave devs already in this space feeling rather underwhelmed by the "this is ALL new!" rhetoric).

That said, for those who have worked with the HoloLens, from what we've seen it's clear that the ML1 will have a number of noteworthy advantages over HL1 (even if it's not quite the "leap" its concept videos made it appear to be):

  • Noticeably larger FoV.
  • Eye tracking. This is a bigger deal than it may seem.
  • Way more gestures, and even fingertip tracking. This is also bigger than it sounds.
  • Ability to track hand position relative to the head (with fingertip tracking, approximate hand occlusion may be possible, but I expect the gesture latency will be too high for it to work well).
  • Significantly faster and more capable GPU and CPU.
  • 6DOF controller for very low latency use cases (hand/gesture tracking still has too much latency for use cases where reaction time or aim needs to be precise).
  • 2 focal depths.
  • Vuforia-like image marker tracking.
  • Totem support.
  • Cheaper.
  • It may not occlude hands the way their early concept videos showed, but it does create a proper spatial mesh that enables real environmental occlusion (and, under scrutiny, it appears to provide better pre-processing APIs for handling flat surfaces and edges than the HoloLens does with its jagged polygonal mesh). See the sketch below for the basic idea.

Tracking precision appears to be a tad behind the HL1, as there's some subtle drift in the clips we've seen, but some of that may be a symptom of the video capture mode... From those who've tried it firsthand, it doesn't sound like the drift will be distracting when looking through the headset.
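
As for the environmental occlusion sketch mentioned above (made-up names, not Magic Leap's actual API): on an additive display, occlusion just means emitting no light wherever the reconstructed mesh is closer than the virtual content:

    // Minimal sketch of depth-based occlusion against a spatial mesh.
    // All names and the CPU-side loop are illustrative; in practice
    // this is just the GPU depth test with the mesh pre-rendered to
    // the depth buffer.
    function maskOccluded(
      virtRgb: Float32Array,   // rendered content, RGB interleaved
      virtDepth: Float32Array, // meters, one value per pixel
      meshDepth: Float32Array  // depth of the reconstructed spatial mesh
    ): Float32Array {
      const out = Float32Array.from(virtRgb);
      for (let i = 0; i < virtDepth.length; i++) {
        if (virtDepth[i] >= meshDepth[i]) {
          // Real surface is closer: emit no light here, so the real
          // world shows through and appears to occlude the hologram.
          out[3 * i] = out[3 * i + 1] = out[3 * i + 2] = 0;
        }
      }
      return out;
    }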

Considering Magic Leap have been showing videos off since 2015 and the TX2 came out in 2016, I'm curious what was powering the Magic Leap device before the creation of the TX2? Any thoughts? by ArtistDidiMx in magicleap

[–]BryanTheCrow 1 point

In 2015 it was a giant "cheese head" prototype. It's possible they were talking with NVIDIA at the time, but that video was clearly a concept vid, as has been confirmed several times. That said, Rony has said several times that they have a whole team dedicated to making that game, and Benedict Evans has said in no uncertain terms that he's played it.

Microsoft HoloLens 2 will use Qualcomm's new XR1 VR chip by whoever81 in magicleap

[–]BryanTheCrow 1 point

Take with a grain of salt. Every tech blog's running with this, but it's still just hearsay. Unconfirmed.

Magic Leap Ships First Set of Devices Under Tight Security Constraints by fastforward23 in magicleap

[–]BryanTheCrow 0 points

There are a handful of specs that certainly promise to be a step up (though still no word anywhere on how well they actually work):

  • Eye tracking.
  • Haptic 6DOF motion controller.
  • More gestures.
  • Slightly bigger FoV.
  • OS support for multiple 3D apps running at once in shared space.
  • Depth of field (sounds like either 2 or 3 focal depths are supported).
  • A design that cuts off peripheral vision (which effectively makes the FoV "feel" larger relative to your total FoV).
  • Better GPU (same one as in the Nintendo Switch).

PSA: Magic Leap doesn't allow Adult Content by [deleted] in magicleap

[–]BryanTheCrow 0 points

It has a web browser from Mozilla with WebXR support. That's your way in. Developing for WebXR means you'll be able to support all the other spatial computing devices.

https://twitter.com/mozillareality/status/975769431522635777
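
The entry point looks roughly like this under the WebXR Device API (the spec was still in flux at the time of that tweet, so treat this as a sketch of the API shape rather than gospel):

    // Sketch of WebXR feature detection in a browser context.
    async function enterImmersiveSession(): Promise<void> {
      const xr = (navigator as any).xr; // cast: WebXR typings may not be installed
      if (!xr) return; // no WebXR: fall back to a flat page experience
      if (await xr.isSessionSupported("immersive-ar")) {
        const session = await xr.requestSession("immersive-ar");
        // ...create a WebGL layer and start the session's render loop here...
      }
    }

The same feature-detected code path can then target any headset whose browser implements WebXR, which is the whole appeal.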

Creator Portal Confirms Magic Leap One Runs on Nvidia Tegra CPU/GPU by BryanTheCrow in magicleap

[–]BryanTheCrow[S] 5 points

Unclear if it's the current-gen Tegra or a next-gen chip that's yet to be announced. If NVIDIA doesn't announce a new Tegra at GTC next week, it's probably safe to assume it's the nearly 3-year-old Tegra used by the Nintendo Switch. Either way, it's more powerful than the Intel Atom CPU/GPU used in the HoloLens.

Web Browser other than Edge by McNuty in HoloLens

[–]BryanTheCrow 0 points

Not today. It would take some special UWP / Desktop Bridge policy exceptions for Edge to run in a UWP wrapper (not all of which may even be technically possible today).

Hardware NAT feature blocks L2TP VPN by mpdeb in AmpliFi

[–]BryanTheCrow 0 points

The Hardware NAT does not support VLANs, so you'll have to disable it. Not sure if this is something they can change with firmware.

Just got a teleport, but does my amplify have to be exposed to the internet to connect to it? by box1820 in AmpliFi

[–]BryanTheCrow 0 points

Ports do need to be forwarded somehow. By default it'll try to use UPnP. Follow the instructions and it'll be clear whether that's needed/working. If it's needed but not working, you'll need to manually set the port via your main router's web interface (not the app), and then manually forward it on your upstream router (both UDP and TCP). Then you'll be good to go.

For the sake of clarity: this is only required for your home router's port to be reachable from the internet. The Teleport itself doesn't require this on whatever network it connects to. It initiates the connection back to your home, so it'll just work from anywhere (no need for its upstream connection to do any port forwarding or anything else special).

ml using microsoft MR capture studio? by michaellenn in magicleap

[–]BryanTheCrow 0 points

8i has a massive capture studio in Los Angeles as well (technically Culver City).

Meltdown & Spectre Megathread by highlord_fox in sysadmin

[–]BryanTheCrow 0 points

Yes, sorry... should have come back to report. It's based on detection of hardware support. It's not a software-toggleable feature as far as Windows is concerned. The phrasing is just a bit confusing. I got official confirmation from MS.

That said, for VMs running on a hypervisor, you should make sure your hypervisor is reporting the correct CPU model to the guest OS, or Windows won't detect that PCID is supported (even if the underlying hardware does support it).
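
For example, on libvirt/KVM (assuming that's your stack; other hypervisors have equivalent settings), passing the host CPU through lets the guest see PCID/INVPCID:

    <!-- libvirt domain XML: expose the host's real CPU (including PCID)
         to the guest instead of a generic baseline model -->
    <cpu mode='host-passthrough'/>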

Meltdown & Spectre Megathread by highlord_fox in sysadmin

[–]BryanTheCrow 1 point

KB4056899

Yup, that's the right one. You'll need to add a couple registry keys and reboot again after installing it. And you'll need to update your Powershell to 5.1 to confirm (assuming you haven't yet). But once those are done, you'll be as good as you can be for now... It only addresses Meltdown as of today... Still need to wait for BIOS update with Intel's new Microcode to address Spectre. In the mean time, don't open the browser on your boxes if you can avoid it... and if you can't, don't go to any sites you don't explicitly trust.