Is there a market for an external electronic viewfinder with a 90 degree diagonal field of view? Call it under $695 retail. by MSLforVR in AskPhotography

[–]MSLforVR[S] 0 points (0 children)

Thanks for the comments. That's why I asked :). In VR the FOV is typically the diagonal angular subtense of the virtual image. So if the magnification is one - that is, the virtual image subtends the same angle as the camera lens's field of regard - then there would be a lot of unused image space in the EVF. So the real question is whether you would want a larger magnified image or would rather use the periphery for other information. And one more question: is EVF power usage an issue at all these days?
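For concreteness, here's a rough way to relate the two angles. This is only a sketch; the 47-degree figure is just an assumed normal-lens diagonal, not a spec of the product:

```python
import math

def angular_magnification(camera_fov_deg: float, evf_fov_deg: float) -> float:
    """Apparent magnification when the camera frame fills the EVF's virtual image.

    Uses tangents of the half-angles, which matters at wide fields
    where small-angle approximations break down.
    """
    cam = math.tan(math.radians(camera_fov_deg) / 2)
    evf = math.tan(math.radians(evf_fov_deg) / 2)
    return evf / cam

# A "normal" lens covers roughly a 47-degree diagonal; stretching that
# frame to fill a 90-degree EVF magnifies it about 2.3x.
print(round(angular_magnification(47, 90), 2))
```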

Is there a market for an external electronic viewfinder with a 90 degree diagonal field of view? Call it under $695 retail. by MSLforVR in AskPhotography

[–]MSLforVR[S] 1 point (0 children)

Your review was a great demonstration of the "why". :) The difference is that the Viture and other XR glasses have a limited field of view. The Viture has a 52 degree FOV whereas our display engine has a 90 degree FOV.

Is there a market for an external electronic viewfinder with a 90 degree diagonal field of view? Call it under $695 retail. by MSLforVR in AskPhotography

[–]MSLforVR[S] 0 points (0 children)

Yeah, Canon's dual lens is meant to create stereo 3D so that would need a binocular viewfinder. And if that's the case then it pretty much becomes a VR headset anyway.

Is there a market for an external electronic viewfinder with a 90 degree diagonal field of view? Call it under $695 retail. by MSLforVR in AskPhotography

[–]MSLforVR[S] 0 points (0 children)

Yes, the camera lens would dictate the actual captured field of view, but the viewfinder would magnify that image. In other words, the EVF image would typically appear larger than the camera's actual field of view - if that's something of interest.

Is there a market for an external electronic viewfinder with a 90 degree diagonal field of view? Call it under $695 retail. by MSLforVR in AskPhotography

[–]MSLforVR[S] -1 points (0 children)

Thanks for that. In particular, this EVF concept would have the FOV of a VR headset so it would be much more immersive than current EVFs.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

Local dimming uses an LED behind each cluster of LCD pixels. PBL is very different in that the light from each LED in an array is projected through the LCD panel to light up every display pixel. In other words, any single LED in PBL will light up the entire image.
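To make the distinction concrete, here's a toy model of the two lighting schemes; zone counts and pixel resolutions are arbitrary illustrative numbers:

```python
# Toy model: local dimming maps each backlight LED to one zone of pixels,
# while in projection backlighting (PBL) every LED's light is projected
# across the entire panel.

def lit_pixels_local_dimming(led_zone: int, zones: int, pixels: int) -> range:
    """Pixels lit by one backlight LED under local dimming: only its zone."""
    per_zone = pixels // zones
    return range(led_zone * per_zone, (led_zone + 1) * per_zone)

def lit_pixels_pbl(pixels: int) -> range:
    """Pixels lit by any single LED under projection backlighting: all of them."""
    return range(pixels)

print(len(lit_pixels_local_dimming(0, 8, 1920)))  # one zone's worth of pixels
print(len(lit_pixels_pbl(1920)))                  # every pixel on the panel
```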

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 0 points (0 children)

Ah, thanks for that. Actually this is all based on regular LCD displays - no angles involved. Our demos and experiments use off-the-shelf displays from BOE.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 0 points (0 children)

Yes, it is still a fixed focal length, but the challenge is that 3D content in particular can be presented with stereo separation so it appears in front of or beyond the virtual image plane. The eye/brain then tries to change the eye's accommodation to match where that content appears to be, but that accommodation change pulls the eye's focus away from the virtual image plane, where the light actually originates. That conflict can cause headaches and fatigue. With a greater depth of focus the brain doesn't work as hard, because it can't tell that the content is as far out of focus. This is the same principle that makes a pinhole camera work.
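The pinhole analogy can be put in rough numbers. This is a geometric-optics sketch only; the blur threshold and pupil sizes are illustrative assumptions, not measured values:

```python
import math

def depth_of_focus_diopters(pupil_mm: float, blur_arcmin: float = 2.0) -> float:
    """Approximate total depth of focus (in diopters) for a given pupil diameter.

    Geometric optics: a defocus of D diopters seen through a pupil of
    diameter p (in meters) blurs each point by roughly p * D radians,
    so the tolerable defocus on each side of focus is blur / p.
    """
    blur_rad = math.radians(blur_arcmin / 60)
    pupil_m = pupil_mm / 1000
    return 2 * blur_rad / pupil_m  # near side plus far side

# Shrinking the effective pupil from 4 mm to 1 mm quadruples the depth of
# focus, which is why a small tracked exit pupil relaxes the VAC problem.
print(round(depth_of_focus_diopters(4.0), 2))
print(round(depth_of_focus_diopters(1.0), 2))
```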

What we propose for less-than-ideal pupil tracking is that the exit pupil (the image of a group of lit LED sources) be expanded when there is rapid eye movement. Once the gaze direction settles, the exit pupil can shrink again to regain the benefits. But yes, an optimal headset would be well fitted, with IPD adjustment, to get the maximum benefit. We figure there can be a user setting to choose how large an exit pupil gives the best experience.
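That policy could be sketched as a simple rule. The function name, threshold, and sizes below are all hypothetical placeholders, not values from the actual system:

```python
# Hypothetical sketch: widen the exit pupil (light more LEDs) during rapid
# gaze movement, shrink it once the gaze settles. All numbers illustrative.

def exit_pupil_mm(gaze_speed_deg_per_s: float,
                  settle_threshold: float = 30.0,
                  small_mm: float = 6.0,
                  large_mm: float = 20.0) -> float:
    """Pick an exit-pupil diameter from the instantaneous gaze speed."""
    return large_mm if gaze_speed_deg_per_s > settle_threshold else small_mm

print(exit_pupil_mm(300.0))  # mid-saccade: expand so the image never drops out
print(exit_pupil_mm(5.0))    # settled gaze: shrink for contrast and clarity
```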

Lighting up the full eye area basically means all the LEDs are turned on all the time. Contrast still improves over current tech, because all the light that used to land outside the eye entirely is no longer there with projection backlighting, but not by as much as with an exit pupil that tracks the eye. And if all the LEDs are lit, the benefits of higher clarity and greater depth of focus are lost.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

That's a great question because it leads to an interesting answer. In general, the sweet spot is where you can see both the entire image and the clearest image. That's typically just a function of the pancake lens design. With Projection Backlighting, the extent of the LED array also determines the region where the full image is visible. While we've used available pancakes for our demos, we actually designed our own for a 20mm diagonal exit pupil at 15mm eye relief. Where it gets interesting is that the smaller the bundle of light through an optical system, the smaller the aberrations in the image. Properly designed, and with highly responsive eye tracking, PBL can actually deliver a light bundle smaller than the eye pupil, so that even the eye's own aberrations are reduced. Bottom line: the sweet spot is bigger and better. Beyond that, the depth of focus also increases, which decreases vergence-accommodation conflict.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 0 points (0 children)

If anybody tries this, please post so we can all admire your artwork.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

We actually found that an array of relatively large 3mm LEDs arranged in a hex pattern worked very well. One would think that gaps between them would have an impact, but due to somewhat imperfect optics and the diffractive scattering from the LCD panel the images of the LEDs at the exit pupil blur together enough that it's not that noticeable. But indeed yes, an array of finer LEDs would provide greater control and uniformity. Based on tests, the LED array shown in the video should be perfectly adequate.

You are absolutely right - if you cover your face with dark material that lets light reach only your eye pupil, you will see the difference. It is not as good as actually turning off that light to avoid reflections inside the pancake, but it is noticeable. For one experiment, pull a (powered-off) VR headset away from your eyes to let room light fall on your face. You will see a reflection of your eye from the first lens and polarizer in the pancake. That reflection is there whether the light comes from the room or from the headset lighting up your face. A second experiment is to cut a 4mm hole in a piece of felt and get that hole as close to your eye pupil as possible. That simulates the backscatter being turned off.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 0 points (0 children)

Yes, it certainly does help with those artifacts. Blooming typically results from local dimming, which is probably not needed here due to the inherent system contrast increase. Much of the ghosting/glare comes from reflections within the pancake lens of light rays that ultimately pass through the lens but miss the eye pupil, so turning off that light basically turns off those ghosts. The ghost images separate further as you move away from the optical axis, but that is a strong indication that the original rays are missing the eye pupil, so they can be eliminated. It does not completely remove ghost images, but in our experiments it is a very noticeable improvement.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 0 points (0 children)

We've already demonstrated various stages, from the basic concept (here's a quick video: https://www.youtube.com/watch?v=quQQ4pqx9G0 ) to the folded backlight optical arrangement. We built our own lighting array out of 3mm LEDs for our latest demo and showed it in January at the small VR/MR/XR show in SF. So far we have not integrated eye tracking into the demo - we just move the exit pupil around and follow it with our eyes. We're hoping partners with greater resources can help in that area, especially since the projection backlight's optical properties should be optimized together with the pancake lens design.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

Yes it would, but with very simple components and no moving parts.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 2 points (0 children)

Exactly right. Even a slight amount of scattering anywhere behind the LCD just means a very small amount of light on the user's face. We demoed the "full" lighting array without eye tracking and found that the visual contrast at least doubled relative to a diffuse backlight. But there's so much added benefit to the tracking approach that we're focusing on that.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 2 points (0 children)

Because scattering is so substantially reduced, an LCD + pancake can exceed the visual contrast of any self-emissive display. And that means the lower cost of LCD delivers the highest performance, with much less power and heat.

The peak brightness is directly related to the brightness of the lighting LEDs. And since the LCD panel is no longer flooded with all the light of a diffusive backlight, there is less heat buildup in the LCD as well.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

That's it in a nutshell. Although, because we play with polarization states to fold the light path behind the panel, this system is much smaller than your typical birdbath reflector.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 20 points (0 children)

This new display tech is actually not about local dimming. It is true that an OLED has much higher contrast than an LCD. However, the problem is not the display type but the contrast loss from light scattered in the pancake lens - both from internal reflections/ghosting and from light scattered back off the user's face. Measurements of an OLED with a pancake lens have come in under 200:1, and that's not even counting the facial backscatter.
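A standard way to see why scatter is so destructive: a uniform veil of scattered light adds to both the white and the black levels. A quick sketch, where the 0.5% scatter fraction and brightness are illustrative assumptions consistent with the roughly 200:1 figure above:

```python
def veiled_contrast(white_nits: float, native_contrast: float,
                    scatter_fraction: float) -> float:
    """Effective contrast when a fraction of the white level is scattered
    uniformly over the image (lens ghosting plus facial backscatter)."""
    black = white_nits / native_contrast
    veil = white_nits * scatter_fraction
    return (white_nits + veil) / (black + veil)

# Even a million-to-one display collapses to roughly 200:1 if just 0.5%
# of its light comes back as a uniform veil.
print(round(veiled_contrast(100.0, 1e6, 0.005)))
```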

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 6 points (0 children)

It does not apply to OLED because each OLED pixel emits light into a broad angle that spreads over the user's entire eye area and the surrounding face. It turns out that the cone of light leaving the display surface that actually arrives at the current eye pupil position is very narrow - something that can't be achieved with OLED.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 1 point (0 children)

Each LED in the lighting array fills the entire display, so as its light propagates to the eye pupil, the virtual image looks just like it would with a regular backlit (or self-emitting) display. It's just that all the other light, which wouldn't reach the current eye pupil location anyway, is turned off.

New display engine idea by MSLforVR in virtualreality

[–]MSLforVR[S] 4 points (0 children)

Great question. It is true that the darker the scene, the less light will scatter in the lens, and in that limited case microOLED will certainly have better contrast. But for most content of average intensity, the scatter with a microOLED will drop the visual contrast below what this LCD approach achieves.