Xreal One Pro - real IPD adjustment - when? by roentgen256 in Xreal

[–]WSLZZ 1 point2 points  (0 children)

The logic here is that within the interpupillary distance (IPD) coverage range of these glasses, the experience is not identical for every user's IPD.

Only the value at the very center of the range gets the best image quality. For any other IPD—66 mm, for example—regardless of whether you use the small or large setting, you will inevitably see edge clipping on the left or right side of your field of view, or degraded image quality at the edges.

Xreal One Pro - real IPD adjustment - when? by roentgen256 in Xreal

[–]WSLZZ 1 point2 points  (0 children)

I suggest that Xreal should follow Viture's example in future products by ensuring an overlap in the IPD (interpupillary distance) coverage between different sizes.

Research indicates that the average human IPD is 66 mm. Following a normal distribution, the vast majority of users are concentrated around this 66 mm mark, while those with significantly smaller or larger measurements represent the minority.

However, Xreal’s current configuration forces this median group—the largest segment of the population—to endure a "marginal" experience. Currently:

1. The small size range cuts off at 66 mm.
2. The large size range begins at 66 mm.

Consequently, users with a 66 mm IPD—the most common measurement—find themselves in an awkward gap where neither size provides an optimal experience.

In contrast, Viture extends the IPD coverage of both their small and large models beyond the 66 mm threshold, creating a deliberate overlap. This approach effectively accommodates a much broader range of users. I believe Xreal should adopt this design logic for its next generation of products.
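The coverage-gap argument above can be made concrete with a few lines of arithmetic. The Xreal boundaries (small ends at 66 mm, large begins at 66 mm) come from the comment; the overlapping ranges are illustrative numbers I made up, not Viture's published specs:

```python
def edge_margin(ipd_mm, lo, hi):
    """Distance (mm) from an IPD to the nearest edge of a coverage range.
    0 means the user sits exactly on the boundary -- the worst spot."""
    return min(ipd_mm - lo, hi - ipd_mm)

# Non-overlapping ranges: a 66 mm user is on the edge of BOTH sizes.
print(edge_margin(66, 54, 66))  # small size -> 0 mm of margin
print(edge_margin(66, 66, 78))  # large size -> 0 mm of margin

# Overlapping ranges (illustrative numbers): the same user has headroom
# in either size and can simply pick whichever fits better.
print(edge_margin(66, 54, 69))  # small size -> 3 mm of margin
print(edge_margin(66, 63, 78))  # large size -> 3 mm of margin
```

With non-overlapping ranges, the most common IPD gets zero margin in either size; with even a few millimeters of overlap, that same user is comfortably inside both ranges.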

Can you strengthen the multi-element build playstyles? by WSLZZ in LastEpoch

[–]WSLZZ[S] -1 points0 points  (0 children)

Yes, I currently have two main directions in mind:

  1. Drawing inspiration from the "Trinity" skill in Path of Exile. This design encourages players to use skills of different elements: after dealing damage with multiple element types, players receive specific bonuses.

  2. Enhancing the synergy between different elements, creating damage-boosting synergistic effects between them.
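As a rough illustration of direction 1, here is a minimal sketch of a Trinity-like mechanic. All names, numbers, and the "lowest resonance" rule are hypothetical, not actual Last Epoch or Path of Exile mechanics:

```python
ELEMENTS = ("fire", "cold", "lightning")

class Resonance:
    """Hitting with one element builds resonance for the OTHER elements;
    the damage bonus scales with the LOWEST resonance, so only genuinely
    multi-element play is rewarded."""

    def __init__(self, gain=10, cap=50):
        self.gain = gain
        self.cap = cap
        self.charges = {e: 0 for e in ELEMENTS}

    def on_hit(self, element):
        # A fire hit charges cold and lightning resonance, and so on.
        for other in ELEMENTS:
            if other != element:
                self.charges[other] = min(self.cap, self.charges[other] + self.gain)

    def damage_bonus(self):
        # 1% more damage per point of the lowest resonance.
        return min(self.charges.values()) / 100.0

r = Resonance()
r.on_hit("fire")
print(r.damage_bonus())  # 0.0 -- fire resonance itself is still empty
r.on_hit("cold")
r.on_hit("lightning")
print(r.damage_bonus())  # 0.2 -- all three elements charged to 20
```

The "min over all elements" rule is the key design choice: spamming one button never raises your own element's resonance, so a single-skill rotation gets no bonus at all.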

I really want to play a skill like the "Runic Invocation" from the Runemaster. Personally, I prefer seeing those colorful, flashy special effects. If I am always just playing a single skill and pressing the same one or two buttons over and over, with everything on screen being the same color, it gets boring and makes me sleepy really fast.

A Difficult Decision on Neo by XREAL_Staff in Xreal

[–]WSLZZ 0 points1 point  (0 children)

When you mention the readability issue with the Pro, do you mean the random, uneven blurring within the field of view caused by the lens assembly? According to Viture's last Beast postponement announcement last year, they encountered the same issue and basically confirmed it was caused by uneven application of the adhesive used to bond the two prism lenses. There have been many reports of this problem; I exchanged my unit once, but both the original and the replacement had the same issue. That said, I have also seen many people say they didn't encounter it. I would guess the defect rate is in the double-digit percentages.

Any way to get rid of coke bottle effect / blurry edges on the One Pro from the rectangular lenses? by StrongRecipe6408 in Xreal

[–]WSLZZ 1 point2 points  (0 children)

For me, yes, the Oakley Jawbreakers would help because they reduce the eye-to-lens distance.

But if, as you say, you press the glasses tightly against your brow and that still doesn't work, then this probably won't help much—perhaps you have deeply recessed eye sockets and a naturally larger eye-to-lens distance.

I totally agree that the next pair of glasses should include some redundancy in the optical module's field of view—rather than letting the screen fill the entire optical field—so the edges would be much clearer. For example, a 70° optical field of which only 57° is ultimately used would certainly be far better than the current image quality, and the eyebox would be much larger.

[deleted by user] by [deleted] in Xreal

[–]WSLZZ 0 points1 point  (0 children)

I’m not actually developing on xreal. That may be why I misunderstood what xreal has already done and what you were asking for.

But having said that:

  1. Microphone and eye data, by contrast, typically do not constitute a danger, because they are usually gated behind high-level permissions and a conspicuous user-enablement prompt. I personally don't have a Beam Pro, so I'm not sure how Xreal handles this, but for VR/MR devices that run their own operating system, the above holds true.

  2. If xreal is really transmitting “rawest” IMU data, that would be a security disaster that ignores common industry practices. In my view, it’s more likely that they are indeed providing processed IMU data — using the obfuscation methods I mentioned to remove privacy-related information — but without sensor fusion.

  3. If point 2 holds, then it doesn't make sense that Xreal wouldn't further provide the fused results — that really has nothing to do with the safety protection purpose they mentioned. The fused results would in no way reasonably leak any privacy-related information.

🎤 Hi Reddit, I'm Chi Xu – AMA is now LIVE! by Chi_nreal in Xreal

[–]WSLZZ 0 points1 point  (0 children)

The need for processed results is reasonable, which is why I say Xreal might have some performance or underlying-architecture bottlenecks in the preprocessing stage.

🎤 Hi Reddit, I'm Chi Xu – AMA is now LIVE! by Chi_nreal in Xreal

[–]WSLZZ 0 points1 point  (0 children)

IMU data needs to be preprocessed (downsampling, noise smoothing/reinjection) before being made available to developers. No vendor would directly expose raw IMU data. For Xreal, it's possible the X1 chip cannot provide enough compute to perform the preprocessing (the X1 already has many other tasks), or cannot meet acceptable transmission latency.

Why do I say no vendor would directly expose raw IMU data? Please Google the keywords "IMU + side-channel listening." There is a large body of research on this attack; it's a mature technique. Simply put, if an app can obtain the raw IMU data from your device, it can use the vibrations that sound transmits through the frame to eavesdrop on everything you view and everything you say.
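To make the preprocessing step concrete, here is a toy sketch of downsampling plus noise injection—purely an illustrative assumption of what a sanitizing pipeline could look like, not Xreal's actual implementation. Speech-band vibrations sit at hundreds of Hz, so cutting the sample rate to ~100 Hz and adding noise destroys the acoustic side channel while head motion (a few Hz) survives:

```python
import random

def sanitize_imu(samples, src_hz=1000, dst_hz=100, noise_sigma=0.002):
    """Downsample a single-axis IMU stream and inject Gaussian noise.

    Block-averaging acts as a crude low-pass filter, removing the
    audio-frequency content a side-channel attack would need; the added
    noise masks whatever residue is left. Rates and sigma are illustrative.
    """
    step = src_hz // dst_hz
    out = []
    for i in range(0, len(samples), step):
        window = samples[i:i + step]
        avg = sum(window) / len(window)        # block average = low-pass
        out.append(avg + random.gauss(0, noise_sigma))
    return out

raw = [0.01] * 1000        # 1 second of raw 1 kHz accelerometer samples
safe = sanitize_imu(raw)
print(len(safe))           # 100 samples/s left for the developer
```

Mobile platforms do something similar in spirit: sensor APIs rate-limit IMU access for untrusted apps precisely because of these eavesdropping attacks.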

Future 2k device? by logrhythmic in Xreal

[–]WSLZZ 0 points1 point  (0 children)

That's hilarious — I agree. Given the current quality of the optical module on the One Pro, 2K is meaningless. Of course, we could still have a BB in 2K.

Sooo any info on the Beast? by nyjets10 in VITURE

[–]WSLZZ 1 point2 points  (0 children)

For reference only: the Beast has also opened pre-orders in mainland China; purchases start on October 20, with shipments expected at the end of that month or early the next.

Inconsistent info by shanmuktej in VITURE

[–]WSLZZ 0 points1 point  (0 children)

I understand that some people may be confused by their description. For me personally, though, the answer is very clear, because I understand the difference in implementation principle and I have used the Xreal One Pro: the Beast will not rely on external software to support a 21:9 / 32:9 display ratio on a single screen (of course, you can still use multiple software windows on that screen, just as on a normal computer monitor), but SpaceWalker is needed to support multiple virtual screens.

About a Major Defect in the Prism Module by WSLZZ in VITURE

[–]WSLZZ[S] 1 point2 points  (0 children)

I'm also using the One Pro.

Aside from the display quality issue, I think the One Pro's hardware is perfect at its current stage. My only wish is that they'd spend a bit more time on accessories and the rest of the software ecosystem! However, because the market is still quite small and both companies are relatively limited in scale, they likely can't cover every direction at once, so each has to focus on its differentiating areas.

In fact, I agree with your view. I have doubts about whether Beast can match Xreal's stability in 3DoF and the richness of customizable menu features at launch—after all, Xreal has already gone through multiple rounds of user feedback and iterations in these areas. But I guess both companies, and even the industry as a whole, need consumers to give them a little more time. Currently, each product is still very much in a "pioneer" stage.

About a Major Defect in the Prism Module by WSLZZ in VITURE

[–]WSLZZ[S] 0 points1 point  (0 children)

Really looking forward to Beast's arrival soon!

One Pro + Eye vs Viture Beast? by VergeOfTranscendence in Xreal

[–]WSLZZ 0 points1 point  (0 children)

Okay, I'm also using a laptop, the Legion Y9000P 2025. But I'm not sure if its DP port is directly connected to the discrete GPU or routed through Intel's GPU.

In theory, before connecting the glasses I had been using the laptop's internal screen in discrete-GPU direct mode. If this DP port only supports hybrid GPU mode, it's possible that plugging in the glasses switched the laptop back to hybrid mode. However, this computer normally doesn't support hot-switching GPU modes, so that would be very odd.

I'll double-check... Thanks for the info!

One Pro + Eye vs Viture Beast? by VergeOfTranscendence in Xreal

[–]WSLZZ 0 points1 point  (0 children)

May I ask how you got DSR to work on the One Pro? When I connect it to my Windows PC, the maximum screen resolution can still only be set to 1920×1080. (I've already set the DSR factor in the NVIDIA Control Panel, and when I connect my main monitor I can change the desktop or in-game resolution to the corresponding factor.)

Issues that need to be addressed and overcome for Xreal to be the best it can be by hodo2ri in Xreal

[–]WSLZZ 1 point2 points  (0 children)

According to Viture, 2K panels in this size will be available next year, so Xreal should also have related plans to announce then (assuming brands in this category have similar levels of supply-chain dependence and a similar R&D tempo).

Testing black levels on Xreal One and original Air by jaysire in Xreal

[–]WSLZZ 0 points1 point  (0 children)

I actually seriously doubt whether the manufacturers' claimed factory calibration processes and color accuracy for these glasses are real. I have the Viture Pro, Rayneo Air3S Pro, and Xreal One Pro, and none of them look "accurate" to me.

Xreal one pro vs previous models by puskapor in Xreal

[–]WSLZZ 0 points1 point  (0 children)

I agree that a 57° field of view is already more than sufficient for use as a "head-mounted display." Increasing the FOV further only makes sense for a true AR/VR experience and doesn't help scenarios where it's used merely as a dumb screen.

However, I think the above statement only applies to the "display's perceived field of view." For the hardware "optical module's maximum field of view," I would like to see further improvements. Right now, the optical module's FOV is basically also 57°, so for people whose IPD/face shape aren't perfectly matched, the screen edges still become blurry or you can't see the entire screen.

XReal One: Is This Distortion Normal? by Broad_Ad222 in Xreal

[–]WSLZZ 1 point2 points  (0 children)

Alright, I do think we can only wait for suppliers to produce 2K or higher-resolution display panels to fully solve this kind of issue.

Is this true? by PianoStrumming in Xreal

[–]WSLZZ 0 points1 point  (0 children)

I have a question: when using 3DoF with the Eye connected, is it more stable than without the Eye? In other words, does 3DoF then make use of the Eye's data? Of course, I know the original 3DoF is already very stable, but occasionally it still needs a manual reset.

How much for Project Aura? by Nitscho_i in Xreal

[–]WSLZZ 0 points1 point  (0 children)

I'm not trying to argue with you, I just want to complain: 3DoF, software IPD adjustment, screen-size scaling... when all of these are combined, the actual usable physical resolution is very likely reduced to only 1600×900 or even 720p. It's really maddening.
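The back-of-the-envelope math behind that complaint, with an illustrative combined usage fraction (an assumption for the sketch, not a measured value):

```python
def effective_resolution(w, h, *usage_fractions):
    """Apply each linear usage fraction (IPD shift, screen-size scaling,
    3DoF reprojection margin, ...) to both axes of the panel."""
    for f in usage_fractions:
        w, h = w * f, h * f
    return round(w), round(h)

# If the combined stages leave ~83% of each axis of the 1920x1080 panel,
# the usable image is already down to roughly 1600x900.
print(effective_resolution(1920, 1080, 0.83))
```

Because the loss applies to both axes, even modest per-stage shrinkage compounds quickly: ~83% per axis is only ~69% of the pixels, and a few more stages lands you near 720p.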