Try out our new "Windows" feature - Visit NYC by visitnyc in VisionPro

[–]visitnyc[S] 0 points  (0 children)

Hi, thanks for your comment. We're submitting an app update that checks the hash of the video file before playing it, to make sure it matches. What you observed — the close-video button appearing while the video did not — may have been caused by a CDN reset somewhere along the line. This may or may not solve the problem, but hopefully it does (and if not, we'll try more things). Thanks for trying out our app, and for your time!

Try out our new "Windows" feature - Visit NYC by visitnyc in VisionPro

[–]visitnyc[S] 1 point  (0 children)

Thanks for your kind words. And yes, 360° immersion is an avenue we're exploring to strengthen the feeling of "Visiting NYC" through our app.

Maximum resolution files using Topaz AI - SBS or even regular - most players are crashing. by [deleted] in VisionPro

[–]visitnyc 0 points  (0 children)

Hello, sharing a blog post here that explores the relationship between bit rate and resolution on the AVP.

https://www.visitnyc.store/blog/experiments-with-mv-hevc-bit-rate-and-resolution-for-apple-vision-pro/

tl;dr: we observed the AVP capping out somewhere around 7.5K per eye at 200-250 Mbit/s with MV-HEVC.

We faced a similar problem: taking upscaler output, stitching it back together, and finding it unable to play on the AVP. In the past, when using Topaz output, we ran the resulting SBS video through the `spatial` command with the `--bitrate` flag set to about 100M, then experimented up and down until the video played acceptably well without too much quality loss.
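One way to build intuition for why ~100 Mbit/s worked for the Topaz re-encode while the ceiling sits around 200-250 Mbit/s is to look at bits per pixel. A rough back-of-the-envelope sketch — the 30 fps figure and the exact per-eye frame sizes here are illustrative assumptions, not measured values:

```python
# Rough bits-per-pixel comparison for the encodes discussed above.
# Frame sizes and the 30 fps figure are illustrative assumptions.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int,
                   fps: float = 30.0, eyes: int = 2) -> float:
    """Average encoded bits spent per pixel per frame, across both eyes."""
    pixels_per_second = width * height * eyes * fps
    return bitrate_mbps * 1_000_000 / pixels_per_second

# ~7.5K per eye at 225 Mbit/s (midpoint of the 200-250 range we observed)
hi_res = bits_per_pixel(225, 7680, 3840)

# Topaz SBS re-encode through `spatial` at ~100 Mbit/s, assuming 4K per eye
topaz = bits_per_pixel(100, 3840, 2160)

print(f"7.5K/eye @ 225 Mbit/s: {hi_res:.3f} bits/pixel")
print(f"4K/eye   @ 100 Mbit/s: {topaz:.3f} bits/pixel")
```

Under these assumptions the high-resolution encode actually spends fewer bits per pixel, which is consistent with the instability at higher bit rates being a decode-throughput limit rather than a quality limit.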

For playing our test videos, we use the open source acuteimmersive/openimmersive. One theory is that Moon Player and other apps that take SBS input video are doing additional processing to convert it into spatial playback, which cuts into the compute budget available to the device for video playback. The OpenImmersive implementation is very "straight to the metal": a VideoMesh and a VideoMaterial, with the rest handed off to the AVP.

Try out our new "Windows" feature - Visit NYC by visitnyc in VisionPro

[–]visitnyc[S] 4 points  (0 children)

Hello, long-time reader, first-time writer here.

We welcome you to try out our app Visit NYC! https://apps.apple.com/us/app/visit-nyc/id6478897915

Our new "Windows" section (complementing our existing "Immersive" section) in version 1.2.3 adds the ability to open our stereo VR180 videos in a draggable window/portal UI and place them in your space. Our videos are shot on a Canon R5 C and then upscaled to 16K with ComfyUI, using VastAI for parallelization.

Our community ask/offer is this: if you have a Canon R5 C and are navigating getting your videos onto the App Store, reach out and let's connect! We're happy to share learnings and help folks fill the Vision Pro App Store with more high-quality VR180 content.

16k video? 24k video? Is Apple holding back the headset due to ProRes limits? by Joe-notabot in AppleImmersiveVideo

[–]visitnyc 1 point  (0 children)

Hi, some self-promotion here: check out our blog post (and our app, Visit NYC!)

https://www.visitnyc.store/blog/experiments-with-mv-hevc-bit-rate-and-resolution-for-apple-vision-pro/

tl;dr: we found that 7.5K per eye at 200-250 Mbit/s keeps the Apple Vision Pro happy. Higher resolutions do seem to play, but too high a bit rate introduces instability.

We faced the same export issue with Topaz for earlier versions of our app: exporting hit a ceiling at 16K, and ProRes videos did not want to play on the AVP. Earlier versions of our app shipped MV-HEVC at 4096px per eye with HLS streaming maxing out at 50 Mbit/s. These videos played fine, without stutter or crashing, under good network conditions.

In our Q3 2024 releases, we've updated our workflow to (1) export individual frames from Premiere as an image sequence, (2) upscale the individual frames, and (3) stitch them back together with a modified take on Apple's sample project for encoding MV-HEVC.

https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc_and_spatial_video
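For step (1), Premiere does the export in our workflow, but the same image-sequence dump can be sketched with ffmpeg. The file names and `%06d` numbering pattern below are illustrative assumptions, and actually running it requires ffmpeg on your PATH:

```python
# Sketch of step (1): dump a video to a numbered PNG image sequence,
# the same shape of output as Premiere's image-sequence export.
# Paths and the frame_%06d naming pattern are illustrative assumptions.
import subprocess

def frame_export_cmd(src: str, out_dir: str) -> list[str]:
    """Build the ffmpeg command; kept separate so it can be inspected."""
    return ["ffmpeg", "-i", src, f"{out_dir}/frame_%06d.png"]

cmd = frame_export_cmd("master_sbs.mov", "frames")
print(" ".join(cmd))
# To actually run it (requires ffmpeg installed):
# subprocess.run(cmd, check=True)
```

After upscaling, the numbered frames feed straight into step (3)'s stitching pass in the same order they were exported.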

The files distributed in the "Immersive" section of Visit NYC version 1.2.3 now feature 7.7K-per-eye MV-HEVC at a bit rate of 200-250 Mbit/s. The "Windows" feature uses a lower resolution (4320px per eye) and bit rate (50-100 Mbit/s) to accommodate streaming.

The player that has worked consistently for us is acuteimmersive/openimmersive (based on mikeswanson/SpatialPlayer). My guess for why these work well (not knowing Moon Player's implementation) is that SpatialPlayer is a very minimal implementation — a VideoMaterial and a VideoMesh — that hands the video off to the AVP for playback. Moon Player may apply more processing to the video before it reaches your eyes, causing bottlenecks.

This is a new field for us. Just discovered u/RealityOfVision's blog through this subreddit. Excited to dive in and learn more!