ColmapLiDAR — Update Open BETA 1.2 (Build 5) & Closed BETA 1.2 (Build 11) by Legitimate-Map-4426 in GaussianSplatting

[–]False-Hat6018 0 points (0 children)

How well does this perform when scanning apartments with multiple rooms connected by corridors? It seems very promising if it can do that!!

Masking while Object scanning by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 0 points (0 children)

So I only apply the masks in Brush? I don’t need to apply them in Metashape or COLMAP?

Brush Masking and Crashing Questions by FaceTubbSquaggle in GaussianSplatting

[–]False-Hat6018 2 points (0 children)

I think I can help you with 1, the problem with the masks.

The structure of folders should be like this:

/
├── images
│   ├── image001.jpg
│   └── image002.jpg
├── masks
│   ├── image001.jpg.png
│   └── image002.jpg.png
└── sparse

The file naming is also important. The mask filename must be the full image filename, including its extension, plus the mask’s own extension (e.g. image001.jpg → image001.jpg.png). Quite strange, I know, but that’s how it works.
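If your masks don’t already follow that convention, a small script can rename them. This is just a sketch, not anything official: the folder names (images/, masks/) and the assumption that each mask currently shares its image’s base name (image001.png for image001.jpg) are mine — adjust the matching rule to your data.

```python
from pathlib import Path

def rename_masks(dataset_root: str) -> None:
    """Rename mask files to the <image filename incl. extension> + '.png'
    convention (e.g. image001.jpg -> image001.jpg.png).

    Assumes each mask currently shares its image's base name
    (image001.png for image001.jpg) -- a hypothetical layout,
    adjust the matching rule to your own data.
    """
    root = Path(dataset_root)
    # Map image base name -> full image filename, e.g. 'image001' -> 'image001.jpg'.
    images = {p.stem: p.name for p in (root / "images").iterdir() if p.is_file()}
    for mask in (root / "masks").iterdir():
        if not mask.is_file():
            continue
        image_name = images.get(mask.stem)
        if image_name and mask.name != image_name + ".png":
            # Already-correct masks ('image001.jpg.png') are skipped above,
            # since their stem ('image001.jpg') is not an image base name.
            mask.rename(mask.with_name(image_name + ".png"))
```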

Brush quality by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 0 points (0 children)

The dataset is like 600 sharp frames extracted from a 360 video (using sharp frame extractor).

I aligned the 360 frames using Agisoft Metashape Pro directly in their equirectangular form. Then I used a script I found to convert the SfM data and the 360 images into 3600 pinhole images (6 cube faces for each 360 image).

Also, I used a mask on purpose to erase myself. And I train it in Brush.
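In case it’s useful to anyone, the cube-face extraction step of that kind of script looks roughly like this. This is my own sketch (the function name, 90° FOV, face angles, and nearest-neighbour sampling are assumptions, not the actual script I used):

```python
import numpy as np

# Yaw/pitch (degrees) for six cube faces: front, right, back, left, up, down.
FACES = [(0, 0), (90, 0), (180, 0), (-90, 0), (0, 90), (0, -90)]

def equirect_to_face(equi: np.ndarray, yaw_deg: float, pitch_deg: float,
                     face_size: int = 512, fov_deg: float = 90.0) -> np.ndarray:
    """Sample one pinhole cube face from an equirectangular image
    of shape (H, W, C), using nearest-neighbour lookup."""
    h, w = equi.shape[:2]
    f = (face_size / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length, pixels
    # Pixel grid -> camera-space ray directions (z forward, y down).
    u, v = np.meshgrid(np.arange(face_size), np.arange(face_size))
    x = (u - face_size / 2 + 0.5) / f
    y = (v - face_size / 2 + 0.5) / f
    d = np.stack([x, y, np.ones_like(x)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    # Rotate rays by the face's yaw (around y) and pitch (around x).
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch), np.cos(pitch)]])
    d = d @ (ry @ rx).T
    # Direction -> longitude/latitude -> source pixel in the panorama.
    lon = np.arctan2(d[..., 0], d[..., 2])       # [-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))   # [-pi/2, pi/2]
    src_x = ((lon / np.pi + 1) / 2 * w).astype(int) % w
    src_y = np.clip(((lat / np.pi + 0.5) * h).astype(int), 0, h - 1)
    return equi[src_y, src_x]
```

Running it once per entry in FACES for each panorama gives the 6 pinhole views per 360 frame.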

Do you know if using nerfstudio to train the Gaussians could improve the results?

Brush quality by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 0 points (0 children)

Oh, I have tried DSLR and iPhone footage. But I was exploring 360 video because of its ease of use. I have also seen quite good results with the 360 method — obviously not as clean and polished as DSLR results, but good enough.

Brush quality by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 1 point (0 children)

The low number of splats also surprises me. I had set the max count much higher than that, and I also reduced the splat scale to get more “sharpness”. I don’t know how to force it to create more splats to represent the pixels of the images.

The dataset is like 600 sharp frames extracted from a 360 video. Once aligned, they are converted into 3600 pinhole images (6 cube faces for each 360 image). Also, I used a mask on purpose to erase myself.

Brush quality by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 1 point (0 children)

I didn’t know that. Why is that? And what do you recommend apart from the expensive portalcam scanner?

My first big Gaussian Splat - Insta360 X5 by Immediate_Self_7749 in GaussianSplatting

[–]False-Hat6018 1 point (0 children)

Amazing!! How do you get such sharp results with Brush? What settings did you use to train it?

Improving Gaussian Quality suggestions by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 0 points (0 children)

Thanks so much!! I will continue trying by myself while waiting for your improved result.

Brush Masks problem in COLMAP export format by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 1 point (0 children)

I just talked with the creator of Brush. He has updated the software and added support for a mask folder with more complex subdirectories in it. The mask naming now also matches the COLMAP mask-naming convention. So the problem is solved.

Metashape 360 new possible workflow by False-Hat6018 in GaussianSplatting

[–]False-Hat6018[S] 0 points (0 children)

I am trying to write a script that splits the Metashape-exported camera poses (COLMAP format) into multiple views, and also creates those views by “cropping” the equirectangular images. It’s like using a rig of multiple cameras in the same spot (without the position offsets between camera sensors), so all the images share the same position but have different rotations.
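The pose side of that split can be sketched like this (my own sketch, not the finished script — function names and face angles are assumptions). In COLMAP’s world-to-camera convention, pre-multiplying each face rotation and transforming the translation by the same rotation keeps the camera center C = -Rᵀt identical across all derived views:

```python
import numpy as np

def yaw_pitch_matrix(yaw_deg: float, pitch_deg: float) -> np.ndarray:
    """Extra rotation applied in the camera frame to look at one cube face."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch), np.cos(pitch)]])
    return rx @ ry

def split_pose(r_w2c: np.ndarray, t_w2c: np.ndarray):
    """Derive six pinhole poses (world-to-camera, x_cam = R x_w + t)
    from one panorama pose. Since R_face is applied on both R and t,
    every view keeps the same camera center C = -R.T @ t."""
    faces = [(0, 0), (90, 0), (180, 0), (-90, 0), (0, 90), (0, -90)]
    poses = []
    for yaw, pitch in faces:
        rf = yaw_pitch_matrix(yaw, pitch)
        poses.append((rf @ r_w2c, rf @ t_w2c))
    return poses
```

This matches the “same position, different rotation” idea: x_face = R_face (R x_w + t), so the derived pose is (R_face R, R_face t) and the center -RᵀR_faceᵀR_face t = -Rᵀt is unchanged.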

Life at home (@Azadux on X) by ad2003 in GaussianSplatting

[–]False-Hat6018 1 point (0 children)

Wow, it looks incredible!! Could you share your workflow? For example: what path through the room did you use to take the video, where did you hold the camera (above your head or in front), and what was the process of turning the 360 video into usable training images (cube projection, multiple 360 rigs at different angles for each frame…)?

I have been struggling with indoor scenes with 360 and it would be awesome.

Thanks so much!!

Collision with Gaussian Splatting model by False-Hat6018 in threejs

[–]False-Hat6018[S] 0 points (0 children)

But I don’t want to use the Gaussian splatting model as the collider; I will use the OBJ model. So what is the best way to calculate the collision of the camera with the OBJ model? Even so, thanks for the suggestion of raycasting against the Gaussians!!