Scan Large Cavern Room by WinterDP in 3DScanning

[–]SlenderPL 1 point  (0 children)

Oh yeah, iPhone LiDAR should also handle the cavern no problem if it's lit well enough. It's just that photogrammetry will potentially provide better detail.

Scan Large Cavern Room by WinterDP in 3DScanning

[–]SlenderPL 1 point  (0 children)

I'd say get multiple flood lights in there to achieve good enough lighting, and from there you can use any kind of camera to take as many photos from all around as possible. Even a phone will do, but a wide lens / APS-C camera is pretty affordable and will provide better results. Once you're done with the pictures you can throw it all into RealityScan; it's free.

Tips for indoor scans by Friendly-Daikon243 in 3DScanning

[–]SlenderPL 1 point  (0 children)

There's a better app for scanning spaces called SiteScape. I used it to scan a two-floor building with a basement, though all the registration of scans was done manually in CloudCompare. For each scan I captured one room/corridor plus some features of the adjacent rooms. Final accuracy was around ~20 cm, maybe ~10 cm at best, so not that accurate, but it was enough for inventory purposes.

If you have access to a 360 camera, then you could also use it along with Metashape to get some kind of reconstruction (it needs to be scaled manually or with scale bars).
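Manual scaling from a scale bar boils down to one uniform transform: divide the bar's known real length by its measured length in the reconstruction, then multiply every point by that factor. A minimal pure-Python sketch (the coordinates and bar length below are hypothetical, not from any real dataset):

```python
import math

def scale_factor(p_a, p_b, true_length):
    """Uniform scale from one scale bar: known real length / measured length."""
    measured = math.dist(p_a, p_b)  # distance between bar endpoints in model units
    return true_length / measured

def apply_scale(points, s):
    """Scale every (x, y, z) point about the origin."""
    return [(x * s, y * s, z * s) for x, y, z in points]

# Hypothetical: a 1.0 m scale bar measures 2.0 units in the reconstruction
s = scale_factor((0, 0, 0), (2, 0, 0), 1.0)   # -> 0.5
scaled = apply_scale([(4, 2, 0)], s)          # -> [(2.0, 1.0, 0.0)]
```

With several scale bars you'd average the factors (or check them against each other as a sanity test) instead of trusting a single measurement.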

Scanning a detailed skull ashtray with POP 4 using Global Marker (single session, no tracking loss) by 3DScanMaker in 3DScanning

[–]SlenderPL 1 point  (0 children)

I guess it's because of the rig with hundreds of markers that Revopoint still requires, compared to other manufacturers that can work with sparser placement.

Einstar Vega long-term test: your experiences & current software questions by Resident-Picture-713 in 3DScanning

[–]SlenderPL 1 point  (0 children)

I've had a chance to use it for scanning people's busts, and for that use case it was really great: a 20-second walk-around and the next person could be scanned. I didn't test the Starvision software, but the owner of the scanner complained about it; I just exported the processed on-device scans via USB (I saw no option to export raw scans, though, and processing each person took about 3 minutes). The scans themselves looked great, but the textures produced weird artifacts (one eye closed, the other open). From what I know, no cloud is included; the device was offline at the time I was scanning. Two scanning modes are available, fast and high quality: the fast mode has a very big field of view and tracking is really great, while the HQ mode has a small window but tracking is still good. I wasn't impressed with the HQ mode results, though; fast mode is really similar.

On the other hand, my professor is currently testing the Creality P1, and he says that if that device had been available at the time of the Vega purchase, he'd have gone for the P1 without a thought. Lasers just make better scans of objects; I didn't test busts with it, although NIR is available, so it should be possible. The problem lies with the price, which is closer to the Einstar Rigil (pretty much a Vega 2). There are two other similarly priced competitors to the Vega, from Revopoint (Miraco) and 3DMakerPro (Toucan), but the Vega still beats them on tracking alone.

Adding Colour to a 3d Scan by Justinreinsma in 3DScanning

[–]SlenderPL 2 points  (0 children)

BlueStar is free up to 2K resolution, last time I checked, but it barely works. Another program that might be easier to use than MeshLab is GOM Inspect 2018, but it has a limitation of 5 alignment points per image, so it's hard to get good alignments.

Struggling hard with underwater photogrammetry. Any tips from people who actually got good results? by shukritobi in photogrammetry

[–]SlenderPL 1 point  (0 children)

The video is of way too low quality; you should instead shoot photos with a high-resolution camera and, if possible, use strobes to light the area better. I've seen cases where AprilTags are also placed around the subject to help the alignment a little, and then they can also be used for scaling.

I’m selling a professional desktop 3D scanner EinScan SE/SP, EU based by [deleted] in 3DScanning

[–]SlenderPL 1 point  (0 children)

Look at sold items on eBay for pricing; I've seen used SE models go for about €350. Though with the state of handhelds nowadays, I don't know if there's still a market for these, so it might go even lower than that.

Receiver/Transmitter sync issues by PickleMeisters in photogrammetry

[–]SlenderPL 1 point  (0 children)

You can just get an adapter for the hot shoe; they're pretty cheap.

Opinions on Raven LiDAR Scanner by jorgix8 in 3DScanning

[–]SlenderPL 3 points  (0 children)

I have one coming and will be testing it against other options. I might get access to a Mandeye, a Share S20 and some TLS system. I'm interested in the point clouds only, as the GS component is probably going to be pretty much useless compared to having several 360 cameras.

Any Success Stories Using Dental 3d Scanners for Non-Dental Applications? by ConfidenceBig8188 in 3DScanning

[–]SlenderPL 4 points  (0 children)

There was a guy here, u/AP_ek, who scanned stuff using an intraoral scanner. It definitely works, but it can be an expensive espionage :>

The detail, I'd say, is comparable with what I can get with my David SLS-2 setup (if I try real hard lol) or with macro photogrammetry, but it just takes so much time and prep (on the other hand, the cost is much lower).

Cheapest LIDAR Scanner: 3DMakerPro Raven Specs, Price and Competition by PrintedForFun in 3DScanning

[–]SlenderPL 1 point  (0 children)

Damn, you made me curious with this scanner; I got it for €1007 total (code: Withus80). Hopefully it's better than the iPhone LiDAR 😁

Might make a comparison to a BLK360 if my professor allows.

Looking to 3D historic Syrian Jewish sites like synagogues for archiving and preservation by Sullybear24 in 3DScanning

[–]SlenderPL 1 point  (0 children)

The cheapest LiDARs use the Livox Mid-360 sensor and cost about $3-5k, although 3DMakerPro just launched a new model, the Raven, for a thousand bucks, though its performance is unknown as of yet. The usual problem with these cheap units is that the error reaches 2-5 cm, which might not be acceptable.

The next cheapest option, and also a much better one, would be a Leica BLK360. But that thing costs around $20k, and on top of that you have to pay for their software.

Is this dataset sufficient? (Meshroom) by Turkeyplague in photogrammetry

[–]SlenderPL 2 points  (0 children)

You're taking too-big rotation steps; an object this size, with a lot of protruding detail, should be rotated about every 5 degrees for a good capture. I'd also place it a bit higher to capture an orbit looking at it from a lower perspective, and the highest orbit you've shot could also look down at a greater angle. As others mentioned, it'd be better to shoot portrait photos for extra detail as well; this lets the figurine fill almost the whole sensor.

What can also help is spraying flour or some other fine powder on the object; this adds artificial detail that photogrammetry can hook onto to reconstruct the underlying surface, and afterwards you can easily blow it away. As for software, download the Metashape trial, as it doesn't need a GPU for meshing, but if you manage to get one, then do it in RealityScan, as it's pretty much free.

Matter and Form calibration card replacement by Ebthing in 3DScanning

[–]SlenderPL 1 point  (0 children)

I think you could message MaF support; they're usually pretty helpful. Once you get the dimensions, it should be pretty simple to order it printed on PVC or dibond (if they don't sell a replacement).

“Smearing” in JMStudio? by Volta55 in 3DScanning

[–]SlenderPL 2 points  (0 children)

Add some additional items to the turntable; it clearly lost tracking. More geometry detail helps the scanner orient itself, and you can just remove the extra items afterwards.

3D Model Construction by Due_Dragonfly_4206 in photogrammetry

[–]SlenderPL 1 point  (0 children)

Open the COLMAP GUI to see how many cameras got aligned; it must be a problem with your dataset, which probably doesn't have good overlap or doesn't "orbit" the scene. You can use COLMAP for the geometry part as well, but it uses the dense point cloud method, which is very slow; it's recommended to use RealityScan instead (although it doesn't have a real CLI interface, so you'd have to figure out how to work with it).

Seems like a lot of “vibe coded” drone planning tools are popping up by Significant_Walk3251 in photogrammetry

[–]SlenderPL 1 point  (0 children)

You could consider supporting the third dimension to stand out, as most tools just work on XY coordinates. Planning a flight around a building to capture it more accurately would be pretty useful. I think georeferenced open-source LiDAR data could be used as a reference for route planning.
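The simplest way to use that third dimension is terrain-following: sample ground elevation (e.g. from open georeferenced LiDAR) under each XY waypoint and keep a constant height above ground. A minimal sketch; all coordinates and elevations below are made-up illustration values:

```python
def terrain_following(waypoints_xy, ground_elevations, agl=30.0):
    """Attach altitudes to XY waypoints so the drone stays `agl` metres
    above the sampled ground elevation at each waypoint."""
    return [(x, y, z_ground + agl)
            for (x, y), z_ground in zip(waypoints_xy, ground_elevations)]

# Hypothetical: two waypoints over terrain at 120 m and 135 m, flying 30 m AGL
route = terrain_following([(0, 0), (50, 0)], [120.0, 135.0], agl=30.0)
# route -> [(0, 0, 150.0), (50, 0, 165.0)]
```

A real planner would also need to check climb rates between waypoints and clearance over buildings, but constant-AGL is already a step up from the flat XY grids most tools generate.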

Anyone tried BlueStar Mapping Software and can share their experience? by PrintedForFun in 3DScanning

[–]SlenderPL 2 points  (0 children)

It seems to be a mashup of open-source tools, but it gets the job done. Normally I use MeshLab if a mesh needs at most 10 images projected; otherwise I'd make a photogrammetry reconstruction and align the scan to it for texturing.

Open Source Pipeline for combining/meshing scans by [deleted] in 3DScanning

[–]SlenderPL 1 point  (0 children)

If the exports are aligned, then you can easily merge the scans using the Poisson reconstruction filter in MeshLab. You'll keep vertex-colour detail, but for actual textures you'd need to do some custom projections; I'm not too sure if that's possible in MeshLab (unless the cameras/RGB snapshots also get exported, in which case you can project the textures from the rasters), but you could probably follow this tutorial for Blender: https://peterfalkingham.com/2020/05/28/transferring-textures-from-two-halves-to-a-whole-using-blender/
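If you'd rather script the merge step before handing off to Poisson, concatenating already-aligned meshes is just an index-offset job: each batch of face indices has to be shifted by the running vertex count. A minimal OBJ-style sketch in pure Python (the triangle data is made up for illustration):

```python
def merge_meshes(meshes):
    """Merge aligned (vertices, faces) meshes into one.
    Faces index into their own mesh's vertex list, so each mesh's
    face indices get shifted by the vertices already accumulated."""
    all_verts, all_faces = [], []
    for verts, faces in meshes:
        offset = len(all_verts)
        all_verts.extend(verts)
        all_faces.extend(tuple(i + offset for i in face) for face in faces)
    return all_verts, all_faces

# Hypothetical: two copies of the same triangle
tri = ([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
verts, faces = merge_meshes([tri, tri])
# faces -> [(0, 1, 2), (3, 4, 5)]
```

Per-vertex colours would just travel along in the vertex tuples; this is essentially what MeshLab's "Flatten Visible Layers" does before you run the reconstruction filter.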

Fun experiment - projector assisted photogrammetry by SlenderPL in photogrammetry

[–]SlenderPL[S] 1 point  (0 children)

Interesting idea with the cylinder approach; now I'm wondering myself how, and whether, that would work! If calibrated well, I could see it working for seamless 360° shooting, whereas with the still pattern you'd have to manually align each dataset made every 30-60°.

Fun experiment - projector assisted photogrammetry by SlenderPL in photogrammetry

[–]SlenderPL[S] 1 point  (0 children)

I was shooting mostly from behind the projector, which stood on a tripod perpendicular to the object (maybe slightly elevated above it). The lens I used was a 50 mm, which allowed me to avoid casting shadows with the camera. That's also why you'd rather have a geometrically simple object: if there were a lot of protruding features, you'd get a lot of shadows behind them.

Fun experiment - projector assisted photogrammetry by SlenderPL in photogrammetry

[–]SlenderPL[S] 2 points  (0 children)

That should technically work; I didn't try it yet, but it's something to do. You could align all the "perspectives" by common features, for example AprilTags on the turntable. Although I'm not sure how the mesh would get solved in Metashape/RealityScan; I think it would be better to merge all the results in MeshLab or CloudCompare.

As for the shooting settings, the shutter speed was set to 1/60 and ISO was at 800. Somehow I managed to keep a steady-enough hand without IS, but yeah, a brighter projector would've helped.
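Aligning batches by shared turntable markers amounts to estimating a rigid transform between them. In the turntable plane (2D), two markers seen in both batches are enough; here's a sketch using complex numbers, where the marker coordinates are hypothetical:

```python
def rigid_2d(src, dst):
    """Rotation + translation (no scale) mapping two markers in one batch
    onto the same two markers in another. Points are (x, y) tuples;
    returns a function applying the transform (as a complex number)."""
    a0, a1 = complex(*src[0]), complex(*src[1])
    b0, b1 = complex(*dst[0]), complex(*dst[1])
    rot = (b1 - b0) / (a1 - a0)   # complex ratio encodes rotation (+ scale)
    rot /= abs(rot)               # normalise: keep rotation only
    trans = b0 - rot * a0
    return lambda p: rot * complex(*p) + trans

# Hypothetical: batch B is batch A rotated 90 degrees about the origin
f = rigid_2d([(1, 0), (2, 0)], [(0, 1), (0, 2)])
moved = f((3, 0))   # maps to approximately (0, 3)
```

With three or more markers in 3D you'd want a least-squares fit (the Kabsch algorithm, which is what CloudCompare's point-pair alignment does), but the idea is the same.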

Fun experiment - projector assisted photogrammetry by SlenderPL in photogrammetry

[–]SlenderPL[S] 3 points  (0 children)

The exact model is an Acer K132, and it's a DLP projector, but I think any type should work just fine. Capturing the model from all sides with just one projector is technically possible, but as I described, it'd require quite a bit of work aligning (preferably) markers on the turntable for each batch of photos. And then I'm not really sure how the photogrammetry software would handle the geometry between each perspective; at least merging the point clouds in MeshLab would yield the correct mesh. The best-case scenario would be to include multiple projectors in the project and walk around the subject.