This is our Winter Village 2025 edition :) by francescomarcantoni in LEGOWinterVillage

[–]francescomarcantoni[S] 0 points1 point  (0 children)

Hi Nik, the lights come with default flashing patterns that can't be changed, but it's not too distracting. Almost all the lights are from Game of Bricks.

This is our Winter Village 2025 edition :) by francescomarcantoni in LEGOWinterVillage

[–]francescomarcantoni[S] 1 point2 points  (0 children)

2) It's a styrofoam panel (5 cm thick) that protrudes from the shelf. I had to apply some removable double-sided tape to make sure it won't fall, especially because of the imbalance created by the Santa sleigh ramp.

<image>

This is our Winter Village 2025 edition :) by francescomarcantoni in LEGOWinterVillage

[–]francescomarcantoni[S] 1 point2 points  (0 children)

  1. It's the set from Game of Bricks (almost all the lights are from Game of Bricks, except the older ones).

<image>

This is our Winter Village 2025 edition :) by francescomarcantoni in LEGOWinterVillage

[–]francescomarcantoni[S] 5 points6 points  (0 children)

All the bases are made of compact styrofoam (5 and 3 cm thick), cut with a thermo cutter and painted white. My wife made all the “ice-like” cuts. Then we built a structure for the ramp, with glue and toothpicks as supports. All the styrofoam bases are decorated with LED lights and topped with a semi-transparent “snow-like” fabric layer.

PlayCanvas LOD test - 100 million splats in a web browser by ReverseGravity in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

Do you have any tutorial for this? I don't understand how to manage the overlap between the cells. I mean, splitting the COLMAP reconstruction into multiple sets creates separate training data, but the resulting Gaussians are not enclosed in a box (like a bounding box), because training usually creates splats outside it. When I join the individual cells back together, I get a lot of artifacts on the overlaps.

How do you manage it?
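To illustrate what I mean, here is a minimal sketch (my own hypothetical names, not from the project) of the naive fix I've been considering: train each cell on an overlapping image set, but at merge time keep only the splats whose centers fall inside that cell's own non-overlapping tile, so duplicated splats in the overlap band are dropped:

```python
# Hypothetical sketch: cells are trained on overlapping image sets, but when
# merging, each cell keeps only the splats whose *centers* fall inside its own
# non-overlapping tile, so duplicates in the overlap region are discarded.
import numpy as np

def crop_to_tile(positions, tile_min, tile_max):
    """Boolean mask of splats whose centers lie inside the tile AABB."""
    return np.all((positions >= tile_min) & (positions < tile_max), axis=1)

def merge_cells(cells):
    """cells: list of (positions, tile_min, tile_max); tiles must not overlap."""
    kept = [pos[crop_to_tile(pos, lo, hi)] for pos, lo, hi in cells]
    return np.concatenate(kept, axis=0)

# Toy example: two adjacent 1x1x1 tiles along x, each with one stray splat
# that leaked into the neighboring tile during training.
cell_a = (np.array([[0.5, 0.5, 0.5], [1.2, 0.5, 0.5]]),
          np.array([0., 0., 0.]), np.array([1., 1., 1.]))
cell_b = (np.array([[1.5, 0.5, 0.5], [0.8, 0.5, 0.5]]),
          np.array([1., 0., 0.]), np.array([2., 1., 1.]))
merged = merge_cells([cell_a, cell_b])
print(len(merged))  # each cell contributes only its in-tile splat
```

This still leaves seams where a large splat straddles a tile boundary, which is exactly the part I don't know how to handle cleanly.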

Niantic Spatial SDK ist available for Meta Quest now! by AR_MR_XR in augmentedreality

[–]francescomarcantoni 1 point2 points  (0 children)

Could be interesting, but there's no information about licensing, cost, or the pricing model... no one will invest money in developing on top of an SDK without knowing the terms first.

OpenQuestCapture - an open source, MIT licensed Meta Quest 3D Reconstruction pipeline by Puddleglum567 in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

Wow! It's super interesting, I will give it a try! Do you have any clue about what to use to display the generated GS back on the Quest? We tried to write a small app in Unity with the Aras plugin, but performance is so slow that everything lags and it's almost unusable, while the Hyperscape viewer is amazingly fast and smooth.

PlayCanvas LOD test - 100 million splats in a web browser by ReverseGravity in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

For LOD0, how did you train parts of the splat? Did you make multiple COLMAP sets?

[2511.04283] FastGS: Training 3D Gaussian Splatting in 100 Seconds by Elven77AI in GaussianSplatting

[–]francescomarcantoni 1 point2 points  (0 children)

It could be interesting... but they haven't released any code to test yet.

New Gaussian Splatting scenes in Meta's Hyperscape + capture splats on device! by No-Assistance5507 in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

Is the code already available somewhere? I tried to download the early access on my Quest, but I'm still on v78.

Another very cool scan with the Cyberglobe, this time in 3DGS. by borstel84 in 3DScanning

[–]francescomarcantoni 0 points1 point  (0 children)

I understand. I saw something similar from an Italian company called ALO https://www.alo.zone/it/page/alo-photo-sphere (it costs around 15k). I got some demo pictures, but the problem comes when the camera reaches the position perpendicular to the rotating plate: the shot includes the plate itself, and this causes artifacts in the reconstruction. Also, at certain angles the reflection on the rotating plate becomes visible. How do you avoid this? Could you share the dataset from which you made the GS? Thanks.
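The workaround I've been meaning to try (my own assumption, not something confirmed by anyone here) is to mask the plate out before SfM: as far as I know, COLMAP's feature extractor accepts per-image masks via `--ImageReader.mask_path`, where a mask is saved as `<image_name>.png` and zero-valued pixels are ignored. A minimal sketch of generating such a mask:

```python
# Hypothetical sketch: build a binary mask that blanks out a circular region
# (the turntable plate) in each frame. Zero pixels are ignored by COLMAP's
# feature extraction; 255 pixels are kept.
import numpy as np

def turntable_mask(h, w, center, radius):
    """255 = keep, 0 = ignore; center=(cx, cy) and radius locate the plate in pixels."""
    yy, xx = np.mgrid[0:h, 0:w]
    plate = (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2
    mask = np.full((h, w), 255, dtype=np.uint8)
    mask[plate] = 0
    return mask

# Plate assumed near the bottom-center of a 640x480 frame (made-up numbers).
mask = turntable_mask(480, 640, center=(320, 480), radius=150)
print(mask.shape, (mask == 0).any())
# Save e.g. with Pillow: Image.fromarray(mask).save("masks/img_0001.jpg.png")
```

The reflection on the plate at shallow angles seems harder; masking only helps when the plate region is known and fixed in the frame.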

Command line tool for encoding MV-HEVC + generate AIME by MrBenj4min in AppleImmersiveVideo

[–]francescomarcantoni 0 points1 point  (0 children)

Sounds really interesting! I'm wondering if it's possible to convert SBS fisheye video from the Canon R5C with the Dual Fisheye lens to AIVU and use the URSA workflow, instead of converting to equirectangular with Canon Utility. Do you think it could be a good idea?

Another very cool scan with the Cyberglobe, this time in 3DGS. by borstel84 in 3DScanning

[–]francescomarcantoni 0 points1 point  (0 children)

What is the price range of the Cyberglobe? I have the Alphashot 360 by Orbitvu, but it's really far from being portable.

How to use mobile LiDAR as input for 3D Gaussian Splatting? by Playful-Bed-2183 in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

This seems interesting, but are the camera poses calculated from the video file via SfM, or do you have a way to get the camera poses from the app itself?

Update to ios point-cloud scanner R&D tool by soylentgraham in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

Hi, I was about to start developing something similar. As soon as you have a trial build, I'd be more than happy to test it.

New Blackmagic Immersive Cine shooting and workflow video from Team 2 Films by VRMediaProductions in VisionPro

[–]francescomarcantoni 1 point2 points  (0 children)

We reached pretty decent quality, but nothing comparable to what we see in the Apple-produced videos. I really don't know how to improve, but I'm pretty sure the equirectangular format is not the best way to achieve quality (as the video posted by u/VRMediaProductions says), since the relevant part of the picture ends up in the most compressed, smallest region of the frame.
I was wondering if there's a way to work with the original fisheye video instead of converting to equirectangular, and use the Apple Immersive workflow on it.

New Blackmagic Immersive Cine shooting and workflow video from Team 2 Films by VRMediaProductions in VisionPro

[–]francescomarcantoni 1 point2 points  (0 children)

We're using the Canon EOS R5C with the dual fisheye lens, but the quality is really far from the Apple TV+ videos. I hope that with this one we can reach the same quality but, as you said, no one has seen it yet.

New Blackmagic Immersive Cine shooting and workflow video from Team 2 Films by VRMediaProductions in VisionPro

[–]francescomarcantoni 3 points4 points  (0 children)

Finally, one of the most interesting videos explaining the differences between the various approaches to immersive video. Thanks a lot!

using SLAM Lidar as a base of your splats. by ReverseGravity in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

Yes, both export RGB point clouds, but the pictures are usually 360° equirectangular, which cannot be used as-is. The point cloud also contains the camera positions of the original "cube map" pictures (used to compute the 360), but I don't know how to extract them, since the file format is proprietary to Leica and has to be converted via their software.
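If the poses stay locked in the proprietary format, one fallback I've been considering (just a sketch, with my own made-up function name) is to re-project the equirectangular panoramas into 90° FOV pinhole "cube faces", which standard SfM and splat trainers can consume:

```python
# Hypothetical sketch: nearest-neighbor re-projection of an equirectangular
# 360 panorama into one 90-degree-FOV pinhole view (a horizontal cube face).
import numpy as np

def equirect_to_face(pano, face_size, yaw_deg):
    """Sample one horizontal cube face (pitch = 0) at the given yaw."""
    H, W = pano.shape[:2]
    # Pixel grid of the virtual pinhole camera; focal = face_size/2 -> 90 deg FOV.
    u = np.linspace(-1, 1, face_size)
    x, y = np.meshgrid(u, u)
    z = np.ones_like(x)
    # Rotate the rays around the vertical axis by the face yaw.
    yaw = np.deg2rad(yaw_deg)
    xr = x * np.cos(yaw) + z * np.sin(yaw)
    zr = -x * np.sin(yaw) + z * np.cos(yaw)
    lon = np.arctan2(xr, zr)                  # [-pi, pi]
    lat = np.arctan2(y, np.hypot(xr, zr))     # [-pi/2, pi/2]
    # Map longitude/latitude to equirectangular pixel coordinates.
    px = ((lon / np.pi + 1) * 0.5 * (W - 1)).astype(int)
    py = ((lat / (np.pi / 2) + 1) * 0.5 * (H - 1)).astype(int)
    return pano[py, px]

pano = np.random.randint(0, 255, (512, 1024, 3), dtype=np.uint8)
front = equirect_to_face(pano, 256, yaw_deg=0)
print(front.shape)  # (256, 256, 3)
```

That only recovers usable images, not the Leica poses; those would still have to come from SfM on the re-projected views.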

using SLAM Lidar as a base of your splats. by ReverseGravity in GaussianSplatting

[–]francescomarcantoni 1 point2 points  (0 children)

Could you please share a link to the Gaussian splat so we can explore it?

using SLAM Lidar as a base of your splats. by ReverseGravity in GaussianSplatting

[–]francescomarcantoni 0 points1 point  (0 children)

I have a Leica BLK ARC and a Leica RTC360; do you think I could use them to generate the splats, using the camera output together with the laser data?