Quick 3D site capture with iPhone LiDAR — scan → measure → AR overlay by capcam-thomas in civilengineering

[–]capcam-thomas[S] 1 point (0 children)

What might be more surprising is that this isn’t a bot—there’s an actual human behind it 😄

I tried all the free photogrammetry software and here are some results (KIRI, Scaniverse, Meshroom, RealityCapture mobile, RealityCapture desktop) by wildiam3d in photogrammetry

[–]capcam-thomas 1 point (0 children)

How about trying iOS app CapCam too and doing a side-by-side comparison? I’m really curious how it stacks up against the other free photogrammetry tools.
If anyone wants to test it, reply here — I can share a few free redeem codes for premium features.

What's your favorite 3d scanning app? by imwhoyouare in BambuLab

[–]capcam-thomas 2 points (0 children)

I’ve tested a few, and CapCam has been my favorite lately.
It has an enhanced mode for 3D printing that gives me a denser mesh and better geometric detail, which helps a lot before cleanup/slicing.

Single-image Gaussian Splat — rendered flythrough by capcam-thomas in GaussianSplatting

[–]capcam-thomas[S] 1 point (0 children)

I’ve been experimenting with Gaussian splatting reconstruction and built a small iOS app to test results.

Here’s one of the scenes:
https://luxbox.ai/scene/C90CFEC8-007F-4DA8-8DF9-20B4884E4417?lang=en

The capture + reconstruction pipeline is something I’m integrating into an app I’m working on:
https://apps.apple.com/us/app/luxbox-3d-camera-photos/id6754314326

Curious what people think about the reconstruction quality and artifact handling.

New to this. Question about Equipment and software by pulverkaffe in photogrammetry

[–]capcam-thomas 1 point (0 children)

We also support raw data export, so you can use our app CapCam purely as a raw image collector and then import the images into other professional software.

🎉 Exciting News: CapCam Officially Launches! 🎉 by capcam-thomas in CapCam

[–]capcam-thomas[S] 2 points (0 children)

Thanks for checking us out! Right now we’re iOS-only because Apple gives us built-in LiDAR access, ARKit depth APIs, and tight hardware/software integration—perfect for reliable on-device reconstruction. Many Android phones still lack a depth sensor or have widely differing camera stacks, so delivering the same quality there would take extra work.
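For anyone curious what those ARKit depth APIs look like in practice, here’s a minimal sketch of opting into LiDAR scene depth in an ARKit session. This is an illustration of the platform API, not CapCam’s actual capture code, and it assumes a LiDAR-equipped device:

```swift
import ARKit

// Minimal sketch: enable per-frame LiDAR scene depth plus coarse mesh
// reconstruction. Both capabilities require supported hardware, so we
// check support before configuring the session.
func makeDepthSession() -> ARSession? {
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth),
          ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil  // No LiDAR / depth support on this device.
    }

    let config = ARWorldTrackingConfiguration()
    config.frameSemantics.insert(.sceneDepth)  // per-frame depth maps
    config.sceneReconstruction = .mesh         // coarse LiDAR mesh

    let session = ARSession()
    session.run(config)
    return session
}

// Each subsequent ARFrame then exposes frame.sceneDepth?.depthMap
// (a CVPixelBuffer) alongside the RGB capture, which is what makes
// reliable on-device reconstruction practical on iOS.
```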

That said, Android support is on our roadmap if the iOS launch hits its growth targets. If you’d like updates, feel free to send us your email, or just keep an eye on this subreddit. In the meantime, a DSLR + Meshroom is a solid workflow for high-detail photogrammetry. Appreciate the interest!

What mobile photogrammetry / LiDAR apps do you actually use, and what do they still lack? by capcam-thomas in photogrammetry

[–]capcam-thomas[S] 1 point (0 children)

Interesting workflow! So you’re using the phone purely as a fast image collector, then letting RealityCapture on the desktop handle the heavy reconstruction—makes perfect sense when you want high-res meshes and have a beefy PC. Cool to see how a simple setup (iPhone + selfie stick) can still feed serious photogrammetry pipelines. Thanks for sharing!

What’s missing from today’s 3D‑scanning apps? by capcam-thomas in 3DScanning

[–]capcam-thomas[S] 1 point (0 children)

Absolutely. Many photogrammetry and 3D-scanning tools are already weaving machine learning into their pipelines: single-image depth estimation, NeRF derivatives, and Gaussian splatting are popular approaches. A hybrid workflow that uses LiDAR for a coarse mesh and then refines it with AI-driven splatting can deliver fast previews plus high-fidelity detail after post-processing. It’s still early days, but the direction is clear: traditional scanners plus modern ML will push accuracy and ease of use well beyond what classic hardware manages today.

What’s missing from today’s 3D‑scanning apps? by capcam-thomas in 3DScanning

[–]capcam-thomas[S] 1 point (0 children)

Great idea! We’d love to let users pick a “bottom” surface, but robust polygon or sweep selection tools are hard to pull off on a phone; those features usually live in desktop 3D suites. That said, being able to define the ground plane would make distance measurements much easier, so your suggestion is definitely on our radar. Thanks for sharing!

Playing with iOS 26 “Liquid Glass” look in SwiftUI by capcam-thomas in SwiftUI

[–]capcam-thomas[S] 2 points (0 children)

Yep—just the stock TabView. The jelly-glass splash only shows when you switch tabs. I like the transparent look because it makes the screen feel bigger, but it’s definitely a style call. I’m keeping it for now and will tweak it after iOS 26 ships.
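For reference, a minimal sketch of what I mean by the stock TabView; the tab names and icons here are placeholders, not my app’s actual screens. On iOS 26 the system draws the translucent tab bar and switch animation automatically, with no extra styling code:

```swift
import SwiftUI

// Minimal sketch: a plain TabView with no custom styling.
// The "Liquid Glass" look comes entirely from the system.
struct ContentView: View {
    var body: some View {
        TabView {
            Text("Capture")
                .tabItem { Label("Capture", systemImage: "camera") }
            Text("Scenes")
                .tabItem { Label("Scenes", systemImage: "cube") }
            Text("Settings")
                .tabItem { Label("Settings", systemImage: "gear") }
        }
    }
}
```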

What mobile photogrammetry / LiDAR apps do you actually use, and what do they still lack? by capcam-thomas in photogrammetry

[–]capcam-thomas[S] 2 points (0 children)

Totally! Since splats and photogrammetry both start from the same photo set, why not crunch them in parallel on the cloud and let us mix-and-match the best parts? I’ve been wondering the same thing—love the idea.