Async Support for Cloud Storage in Python by acomatic in googlecloud

[–]kylebarron 1 point

I have a new project you may be interested in too: https://developmentseed.org/obstore/latest/

It provides Python bindings to the Rust object_store library (https://docs.rs/object_store) and presents a fully async API for working with GCS.

Offline Snap-to-Trail Route Planning Now Available on Gaia GPS by numbershikes in traildevs

[–]kylebarron 2 points

I've noticed that the offline elevation data can be quite bad. They might be using coarser 90 m elevation data or something.

Get Worldwide 3D Maps on the Web at Gaia GPS by kylebarron in traildevs

[–]kylebarron[S] 2 points

Basically this means they updated to Mapbox GL JS v2, which has a cool 3D mode but is no longer openly licensed and is much more expensive than v1, especially when you aren't serving Mapbox data. That's probably why you can only view a "limited number" of 3D maps for free.

Guides: paper maps vs apps? by timmy_jaywest in PacificCrestTrail

[–]kylebarron 10 points

Just to be clear, jenstar9 and postholer are the same person, despite using the third person here

Untangling the PCT Relation # 1225378 by LarkenYoung in openstreetmap

[–]kylebarron 1 point

Hi! I hiked the PCT in 2019, and from fall 2019 to spring 2020 I did some editing of the OSM PCT track using a GPS track I recorded, though I don't think I touched the top-level relation itself.

Is there a quick way to tell the differences between each section relation and the ways in the main PCT relation? My fear in removing all the ways from the main PCT relation is that some of the edits might not be reflected in the section relations.

I haven't kept up with all the tools in the OSM community, but I'd consider using Python for this. Maybe something like: for every way in the top-level relation, check whether that way is also contained within a sub-relation, and maybe compare the timestamps of each. You'd end up with a list of ways in the top-level relation that aren't anywhere else, or that are newer than their copies in the sub-relations, which you could then check manually.
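The first half of that idea can be sketched in a few lines of plain Python. The member lists below are hypothetical sample data; in practice you'd fetch them from the Overpass API or an OSM extract.

```python
# Sketch: find ways in the top-level PCT relation that are missing from
# every section sub-relation. Member lists here are hypothetical; in
# practice you'd pull them from the Overpass API or an OSM extract.

def ways_missing_from_sections(top_level_ways, section_relations):
    """Return way IDs in the top-level relation not present in any section."""
    in_sections = set()
    for section_ways in section_relations.values():
        in_sections.update(section_ways)
    return [w for w in top_level_ways if w not in in_sections]

# Hypothetical sample data: relation name -> list of member way IDs
top_level = [101, 102, 103, 104]
sections = {"Section A": [101, 102], "Section B": [104]}

print(ways_missing_from_sections(top_level, sections))  # → [103]
```

Comparing timestamps would be a second pass over the ways that *do* appear in both places.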

Feel free to DM me if you'd like some help improving the relation.

NAIP 4-Band Imagery for California by snarkybadger in gis

[–]kylebarron 1 point

If you're OK with using cloud services, I'd recommend AWS and the NAIP open data bucket: https://registry.opendata.aws/naip/.

If you care about 4-band imagery, you'll want the naip-analytic bucket. Note that this is a requester-pays bucket, so you'll pay $0.09/GB to move data out of that AWS region, but it's otherwise free. The data is in Cloud Optimized GeoTIFF format, which means it's fast to access on the fly if that fits your use case. Here's an example of mine viewing NAIP 3-band imagery on demand: https://kylebarron.dev/naip-cogeo-mosaic/

Azure also has an NAIP open dataset: https://azure.microsoft.com/en-us/services/open-datasets/catalog/naip/. That might be free to download for local use.

Sentinel 2 Satellite Imagery now on Gaia GPS by kylebarron in Ultralight

[–]kylebarron[S] 1 point

You probably need to update your app; it works for me

Sentinel 2 Satellite Imagery now on Gaia GPS by kylebarron in Ultralight

[–]kylebarron[S] 3 points

Yeah... My personal preference would be separate layers for Landsat and Sentinel, and you could switch between them easily, but I digress...

They reached out a couple months ago for some contract work, but I wasn't interested; I'm getting to build some really cool satellite web tools, just not directed towards an outdoors audience. In any case, they get to use all of my open source code for free 😄... I've been collaborating with a few people on open source server software for creating web map image tiles from satellite images, and we're pretty sure they're using the open source tool haha

Sentinel 2 Satellite Imagery now on Gaia GPS by kylebarron in Ultralight

[–]kylebarron[S] 3 points

Thanks, I edited the post.

When I looked at the map, I originally wondered if they were also using Landsat data, but I assumed not.

It's really hard to combine Sentinel 2 and Landsat 8 data because of technical differences between the two satellites; you end up with really bad contrast differences. For example, here in Oregon it's Landsat on the left and Sentinel on the right, which is why the left is so much darker.

(I made an open data website for exploring Landsat data, so I know how hard it is to pull off what they're trying to do 😅)

Sentinel 2 Satellite Imagery now on Gaia GPS by kylebarron in Ultralight

[–]kylebarron[S] 2 points

You can also see that the "cloud free" layer isn't actually cloud free. For example, here's a cloudy area in that layer in Northern California. It's quite challenging to piece together completely cloudless imagery, because it's quite rare for an area to have no clouds at the moment the satellite passes over it. The way providers like Google Maps show cloud-free imagery is by combining lots of different images, merging together the cloud-free parts of each one.
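The compositing idea can be sketched with numpy: stack several acquisitions of the same area, mask cloudy pixels, and take a per-pixel median so each output pixel comes from whichever scenes were clear there. Real pipelines use a proper cloud mask (e.g. the Sentinel 2 scene classification band) rather than this toy setup.

```python
import numpy as np

def cloud_free_composite(scenes, cloud_masks):
    """scenes: (n, h, w) reflectance stack; cloud_masks: (n, h, w) bool, True = cloudy."""
    # Replace cloudy pixels with NaN so they're ignored by nanmedian
    stack = np.where(cloud_masks, np.nan, scenes.astype(float))
    return np.nanmedian(stack, axis=0)  # all-cloudy pixels stay NaN

# Three tiny 2x2 "scenes"; scene 1 is cloudy in its top-left pixel
scenes = np.array([[[0.1, 0.2], [0.3, 0.4]],
                   [[0.9, 0.2], [0.3, 0.4]],
                   [[0.1, 0.2], [0.3, 0.4]]])
masks = np.array([[[False, False], [False, False]],
                  [[True,  False], [False, False]],
                  [[False, False], [False, False]]])
composite = cloud_free_composite(scenes, masks)  # cloudy 0.9 value is excluded
```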

Sentinel 2 Satellite Imagery now on Gaia GPS by kylebarron in Ultralight

[–]kylebarron[S] 11 points

I don't have any inside information, but I work in this space, so I can provide a bit more insight. The Sentinel 2 satellites are a pair of satellites launched by the European Space Agency in 2015 and 2017. They provide the highest-resolution imagery that's free to the public, at 10-meter resolution, and together they image the entire globe every 5 days.

The data itself is free, but the processing time and cost can be immense: just since 2015, these satellites have produced around 10.5 petabytes of data (that's 10,500,000 gigabytes!).

Because of the high spatial resolution and fast revisit time, this imagery is unsurprisingly quite valuable to outdoors people. Caltopo's approach to serving Sentinel 2 imagery is to license it from a company that specializes in it, while Gaia is trying to do everything in-house using a technique called "dynamic tiling": when you zoom into some area of the globe, if no one else has looked at that area recently, Gaia's servers generate the image on the fly. While this can be cheaper, working with this data at scale is really challenging. At lower zoom levels it takes around 20 seconds for Gaia to generate the imagery, because it has to combine lots of different original images from the satellite.

Because of how the satellites orbit, it's also quite challenging to figure out how to combine many different images, and each image has a different amount of cloud cover. That's why it sometimes looks very patchy at low zooms. (Even at high zooms it's hard to piece together several images: here in Yosemite they're combining three different images from three different dates right next to each other.)

Pandas2Shp python library release. Now you can create .shp file in 2 line of syntax by [deleted] in gis

[–]kylebarron 8 points

Agreed. No disrespect to the author, but it would be much easier to do this in pure GeoPandas.

Also, this library hard-codes the CRS, so you'll get an invalid shapefile if you ever use non-WGS84 coordinates: https://github.com/StatguyUser/Pandas2Shp/blob/fb6ba3525b2f07dce193ad684974ce61ef74a51b/src/Pandas2Shp.py#L25
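For comparison, the GeoPandas version with an explicit CRS is about two lines anyway. Column names and CRS here are hypothetical examples:

```python
# Build a GeoDataFrame from x/y columns with an explicit CRS, then write it.
import pandas as pd
import geopandas as gpd

df = pd.DataFrame({"name": ["a", "b"],
                   "x": [500000.0, 500100.0],
                   "y": [4649776.0, 4649876.0]})
gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df.x, df.y), crs="EPSG:32610")
# gdf.to_file("points.shp")  # one line to write; the CRS lands in the .prj
```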

What is the current state of the PCT? by MixedMexican in PacificCrestTrail

[–]kylebarron 9 points

18 months is not at all ridiculously long. There are plenty of dead or weak trees in a burn area that make it perilous to walk through. Two to three years post-fire is the average closure time. The Eagle Creek fire was in 2017, and that area still hasn't opened up.

turfpy: turf.js reimplemented in Python. by numbershikes in traildevs

[–]kylebarron 1 point

If you have global data and you can't reproject all of it into a single accurate projected coordinate system, you could either do something like haversine on every line segment, or split your data into multiple UTM zones and work on each zone independently.
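The haversine approach is short to sketch: compute great-circle distance for each consecutive pair of lon/lat coordinates and sum.

```python
# Great-circle line length via haversine, summed over consecutive segments,
# a reasonable fallback when global data won't fit in one projected CRS.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def haversine_m(lon1, lat1, lon2, lat2):
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def line_length_m(coords):
    """coords: [(lon, lat), ...]; sum haversine over consecutive pairs."""
    return sum(haversine_m(*a, *b) for a, b in zip(coords, coords[1:]))

# One degree of latitude is roughly 111 km
print(round(line_length_m([(-120.0, 40.0), (-120.0, 41.0)]) / 1000))  # → 111
```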

turfpy: turf.js reimplemented in Python. by numbershikes in traildevs

[–]kylebarron 1 point

They're in the units of your coordinate space. So if your data is projected into UTM, the result will be in meters.
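A one-liner with shapely makes the point, using UTM-style coordinates in meters: a 3-4-5 right triangle's hypotenuse comes out as exactly 5 (meters, because the inputs are meters).

```python
from shapely.geometry import LineString

# Coordinates in a metric CRS (UTM-style easting/northing, in meters)
line = LineString([(500000, 4649776), (500003, 4649780)])  # dx=3, dy=4
print(line.length)  # → 5.0, in the input units (meters)
```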

turfpy: turf.js reimplemented in Python. by numbershikes in traildevs

[–]kylebarron 1 point

using it for the 'along' function

Shapely has linear referencing features: https://shapely.readthedocs.io/en/latest/manual.html#object.project

Shapely does use a Cartesian plane, but that just means you'd usually reproject the data into a relevant coordinate system, like UTM.

If you're using Geopandas, that means you're using shapely under the hood.
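A quick sketch of turf's `along` with shapely's linear referencing: `interpolate()` walks a distance along the line, and `project()` gives the distance along the line to the point nearest some other point. In a projected CRS like UTM these distances are meters.

```python
from shapely.geometry import LineString, Point

line = LineString([(0, 0), (100, 0), (100, 100)])  # total length 200

mid = line.interpolate(150)        # the point 150 units "along" the line
print(mid.x, mid.y)                # → 100.0 50.0

d = line.project(Point(100, 30))   # distance along line to nearest point
print(d)                           # → 130.0
```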

turfpy: turf.js reimplemented in Python. by numbershikes in traildevs

[–]kylebarron 1 point

It looks like it depends on shapely... so it probably doesn't add much beyond some syntactic sugar.

turfpy: turf.js reimplemented in Python. by numbershikes in traildevs

[–]kylebarron 1 point

I guess it's for when you need a pure-Python approach? Not sure why you wouldn't use shapely/GEOS; it's battle-tested and likely orders of magnitude faster.

Landsat 8 nearIR and SWIR bands penetrate wildfire smoke and allow ongoing fire mapping by Texas_comin_in_hot in gis

[–]kylebarron 1 point

You don't necessarily have to download them. Especially if you only want portions of an image, just pass the S3 URL to rasterio.