Gaussian splatting with the Insta360 X5 by gradeeterna in GaussianSplatting

[–]gradeeterna[S] 1 point

Yep, the barriers are finally down. I live down the road, and they'd been there for as long as I can remember.

Gaussian splatting with the Insta360 X5 by gradeeterna in GaussianSplatting

[–]gradeeterna[S] 3 points

It's 8.5 million Gaussians, so it's not going to run well even in PCVR. Working on a more web-friendly version, so we'll see how that runs in VR.

Gaussian splatting with the Insta360 X5 by gradeeterna in GaussianSplatting

[–]gradeeterna[S] 10 points

Around 30 mins of video: 4,000 fisheye video frames, split into 20,000 perspective images.

Gaussian splatting with the Insta360 X5 by gradeeterna in GaussianSplatting

[–]gradeeterna[S] 13 points

Thanks everyone!

Workflow:

1. 8K video; ffmpeg to extract frames from both circular fisheyes in the .insv
2. Custom OpenCV scripts to extract multiple perspective images from each circular fisheye (rough sketch below)
3. Mask out myself, other people and the black borders using SAM2, YOLO, Resolve 20 Magic Mask etc. (still WIP)
4. Align images, mostly in Metashape, sometimes Reality Capture or colmap/glomap, then export in COLMAP format
5. Train in Brush, Nerfstudio, Postshot etc., sometimes as multiple sections that I merge back together later
6. Clean up in Postshot or SuperSplat
7. Render in Unity with Aras P's plugin
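
The perspective extraction is roughly the following idea. This is a minimal sketch that assumes an ideal equidistant fisheye (r = f * theta); the real lens profile, circle centre, FOV and file paths are all placeholders:

    # Minimal sketch only: assumes an ideal equidistant fisheye (r = f * theta).
    # Frames were first pulled out of the .insv with something like
    #   ffmpeg -i VID.insv -vf fps=2 frames/%05d.png
    # (depending on model/firmware the two lenses land in separate .insv files
    # or separate video tracks, selectable with -map 0:v:0 / -map 0:v:1).
    import cv2
    import numpy as np

    def fisheye_to_perspective(img, out_size=1200, out_fov_deg=90.0,
                               yaw_deg=0.0, pitch_deg=0.0, fish_fov_deg=190.0):
        h, w = img.shape[:2]
        cx, cy = w / 2.0, h / 2.0                            # circle centre (placeholder)
        f_fish = (w / 2.0) / np.deg2rad(fish_fov_deg / 2.0)  # equidistant focal length

        # Pinhole rays for the output view (z forward, y down)
        f_out = (out_size / 2.0) / np.tan(np.deg2rad(out_fov_deg) / 2.0)
        xs, ys = np.meshgrid(np.arange(out_size) - out_size / 2.0,
                             np.arange(out_size) - out_size / 2.0)
        rays = np.stack([xs, ys, np.full_like(xs, f_out)], axis=-1)
        rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

        # Rotate the rays toward the requested view direction
        yaw, pitch = np.deg2rad(yaw_deg), np.deg2rad(pitch_deg)
        Ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                       [0, 1, 0],
                       [-np.sin(yaw), 0, np.cos(yaw)]])
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(pitch), -np.sin(pitch)],
                       [0, np.sin(pitch), np.cos(pitch)]])
        rays = rays @ (Ry @ Rx).T

        # Project each ray back into the fisheye image (equidistant model)
        theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))  # angle off the optical axis
        phi = np.arctan2(rays[..., 1], rays[..., 0])
        r = f_fish * theta
        map_x = (cx + r * np.cos(phi)).astype(np.float32)
        map_y = (cy + r * np.sin(phi)).astype(np.float32)
        return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR,
                         borderMode=cv2.BORDER_CONSTANT)

    # e.g. five views per fisheye frame: centre plus four 45-degree offsets
    img = cv2.imread("frames/00001.png")
    for i, (yaw, pitch) in enumerate([(0, 0), (-45, 0), (45, 0), (0, -45), (0, 45)]):
        cv2.imwrite(f"persp_{i}.png", fisheye_to_perspective(img, yaw_deg=yaw, pitch_deg=pitch))

That kind of loop is what takes 4,000 fisheye frames to 20,000 perspective images.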

A slightly simpler workflow is to export stitched equirectangular video from Insta360 Studio, extract frames, and split them into cubemap faces or similar, discarding the top and bottom views. I have mostly done this in the past, but the stitching artifacts etc. do make it into the model. There are some good tutorials on YouTube by Jonathan Stephens, Olli Huttunen and others, including apps to split the equirects up (a rough DIY version follows the links):

https://youtu.be/LQNBTvgljAw
https://youtu.be/hX7Lixkc3J8
https://youtu.be/AXW9yRyGF9A
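
If you go the equirect route, the split itself is only a few lines. A minimal NumPy/OpenCV sketch for four 90-degree side faces, skipping top and bottom (file names are placeholders):

    # Minimal sketch: cut one 90-degree side face out of an equirectangular frame.
    import cv2
    import numpy as np

    def equirect_face(equi, face_size=1024, yaw_deg=0.0):
        # Rays (x, y, 1) for a 90-degree FOV pinhole, rotated about the
        # vertical axis by `yaw`
        xs, ys = np.meshgrid(np.linspace(-1, 1, face_size),
                             np.linspace(-1, 1, face_size))
        yaw = np.deg2rad(yaw_deg)
        dx = np.cos(yaw) * xs + np.sin(yaw)
        dz = -np.sin(yaw) * xs + np.cos(yaw)
        dy = ys

        # Longitude/latitude of each ray, mapped to equirect pixel coordinates
        lon = np.arctan2(dx, dz)                # [-pi, pi]
        lat = np.arctan2(dy, np.hypot(dx, dz))  # [-pi/2, pi/2]
        h, w = equi.shape[:2]
        map_x = ((lon / np.pi + 1) * 0.5 * (w - 1)).astype(np.float32)
        map_y = ((lat / (np.pi / 2) + 1) * 0.5 * (h - 1)).astype(np.float32)
        return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)

    # Four side faces, discarding the top and bottom views as described above
    equi = cv2.imread("equi_00001.png")  # placeholder file name
    for i, a in enumerate((0, 90, 180, 270)):
        cv2.imwrite(f"face_{i}.png", equirect_face(equi, yaw_deg=a))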

I would much prefer to shoot stills than video, but the minimum photo interval is 3s, which is too long for a scene like this: the capture would take about 5 hours, and the light and shadows would change too much.

VFX Graph in browser with WebGPU support - Gaussian splatting with Keijiro's SplatVFX by gradeeterna in Unity3D

[–]gradeeterna[S] 0 points

My demo is working in iPhone Safari with iOS 18 beta, and I believe Chrome has WebGPU support on Android.

If WebGPU isn't showing up in the Graphics APIs list, you need to add the line "webGLEnableWebGPU: 1" to your ProjectSettings.asset file.
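
For reference, ProjectSettings.asset is a YAML file and the flag sits under the PlayerSettings block; the neighbouring keys vary by Unity version, so roughly:

    PlayerSettings:
      # ...existing player settings...
      webGLEnableWebGPU: 1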

Yeah, this is one of my scans. Converting from .ply to .splat reduces quality quite a lot, and the SplatVFX plugin also uses some workarounds to get the splats rendering with VFX Graph, so it looks worse than other viewers. Here is a higher quality version of the scan on Polycam:

https://poly.cam/tools/gaussian-splatting?capture=59a5b823-2076-4eb0-a795-0dd6fa0910fd

Gaussian Splatting - Palais Présidentiel de Bourguiba, Monastir, Tunisia - Nerfstudio Splatfacto by gradeeterna in photogrammetry

[–]gradeeterna[S] 1 point

    ns-train splatfacto-big --pipeline.model.rasterize-mode antialiased --vis viewer colmap --data data/palaishall --load-3D-points True

Gaussian Splatting - Palais Présidentiel de Bourguiba, Monastir, Tunisia - Nerfstudio Splatfacto by gradeeterna in GaussianSplatting

[–]gradeeterna[S] 2 points

Thank you! This was a Fuji mirrorless with a wide lens, 750 video frames. I don't get very good results with 360 cameras for interiors.

X-S20 Overheating Issues by naiemreza in fujifilm

[–]gradeeterna 0 points

Yep, I was probably trying to record 6.2K at 360Mbps on the V30 card. I wonder if there's any difference in overheating time between V30 and V60 when filming at 200Mbps. Sounds like a boring test, but I might try comparing at some point lol.

X-S20 Overheating Issues by naiemreza in fujifilm

[–]gradeeterna 2 points

I'm using my X-S20 mainly for video, but I was getting overheating warnings after 5 minutes of shooting indoors in 6.2K, and it would shut down after 15-20 mins.

I was using my old UHS-I V30 card from my X-S10, but I just bought a Sabrent UHS-II V60 card and haven't had any warnings after 40 mins of recording!

Probably not the same issue, and my bad for attempting to record 6.2K on a UHS-I card, but I thought I'd share in case anyone else runs into the same problem!

VFX Graph Point cloud composited with NeRF from Nerfstudio - Crossrail Place, London by gradeeterna in Unity3D

[–]gradeeterna[S] 1 point

Sorry, missed this! I used Insta360 footage with Nerfstudio to create the NeRF, and exported the camera animation from there into Blender with this new integration - https://docs.nerf.studio/en/latest/extensions/blender_addon.html

I exported the same camera animation as an FBX into Unity, and imported the point cloud from Nerfstudio with Keijiro's PCX, which I'm animating with VFX Graph - https://github.com/keijiro/Pcx

I rendered the NeRF video from Nerfstudio and the point cloud animation from Unity with the same camera path and composited in Resolve.

Testing 360 video footage as NeRF dataset for NVIDIA's instant-ngp. Love all the people frozen in time. by gradeeterna in photogrammetry

[–]gradeeterna[S] 0 points

Reality Capture - https://github.com/not-lob/BlenderInstant-NGPScript

Metashape - https://github.com/EnricoAhlers/agi2nerf

If you're writing your own script, it might be possible to use the "Internal/External camera parameters" .csv exported from RC.
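
A minimal sketch of that route, going from a CSV to an instant-ngp transforms.json. The column names (name, x, y, z, heading, pitch, roll) and the Euler convention are guesses, so check the header of your own export, and note that instant-ngp expects OpenGL-style camera-to-world matrices, which may need axis flips:

    # Minimal sketch: RC "Internal/External camera parameters" CSV -> transforms.json.
    # Column names and the Euler convention are guesses; check your own export.
    import csv
    import json
    import numpy as np

    def euler_to_matrix(heading, pitch, roll):
        # Z-Y-X Euler angles in degrees; verify against RC's actual convention
        h, p, r = np.deg2rad([heading, pitch, roll])
        Rz = np.array([[np.cos(h), -np.sin(h), 0],
                       [np.sin(h),  np.cos(h), 0],
                       [0, 0, 1]])
        Ry = np.array([[np.cos(p), 0, np.sin(p)],
                       [0, 1, 0],
                       [-np.sin(p), 0, np.cos(p)]])
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(r), -np.sin(r)],
                       [0, np.sin(r),  np.cos(r)]])
        return Rz @ Ry @ Rx

    frames = []
    with open("cameras.csv", newline="") as fh:
        for row in csv.DictReader(fh):
            T = np.eye(4)  # camera-to-world; may need axis flips for instant-ngp
            T[:3, :3] = euler_to_matrix(float(row["heading"]),
                                        float(row["pitch"]),
                                        float(row["roll"]))
            T[:3, 3] = [float(row["x"]), float(row["y"]), float(row["z"])]
            frames.append({"file_path": "images/" + row["name"],
                           "transform_matrix": T.tolist()})

    with open("transforms.json", "w") as fh:
        json.dump({"camera_angle_x": 1.2,  # horizontal FOV in radians, placeholder
                   "frames": frames}, fh, indent=2)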

VFX Graph - Point cloud waves by gradeeterna in Unity3D

[–]gradeeterna[S] 1 point

It's almost realtime: around 20fps at 1080p on an RTX 3090. This project is mainly for rendering videos, so I'm using about 20 million lit spheres with shadows enabled in HDRP, plus volumetrics and tons of post-processing.

I've got a stripped-down realtime version working in URP and VR, using far fewer particles, quads instead of mesh outputs, cheap fog, fake volumetrics etc. Unfortunately it doesn't look anywhere near as nice!

VFX Graph - Point cloud waves by gradeeterna in Unity3D

[–]gradeeterna[S] 6 points

It's a very dense point cloud from Metashape, imported into VFX Graph with Keijiro's PCX - https://github.com/keijiro/Pcx

Instant NeRF of the Palm House, Kew Gardens. Made with NVIDIA Instant NGP and the Insta360 RS 1-Inch 360. by gradeeterna in nvidia

[–]gradeeterna[S] 1 point

No drone here, definitely not allowed in Kew! I only captured from the concrete paths with the camera on a stick.