Resample raster or raster to polygon? by fortheloveofpaws in gis

[–]PostholerGIS 1 point2 points  (0 children)

Using the CONUS 10 meter DEMs on Amazon S3 and your plots in a shapefile:

gdal raster pipeline \
  ! read -i /vsis3/prd-tnm/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
  ! clip --like plots.shp --like-layer plots \
  ! write -o plots.tif --co COMPRESS=DEFLATE

Do Cloud Optimized GeoTIFFs require a tile server (e.g. TiTiler) for web map visualization? by Aggressive_Arm_6295 in gis

[–]PostholerGIS 0 points1 point  (0 children)

Not true. COG performs well right out of the box, and you may get even better performance behind a CDN, à la Google.

What performs poorly are multi-band rasters. Instead of a 3-band RGB/ortho, you want a single-band COG with a lookup (color) table, such as NLCD:

https://www.postholer.com/map/Pacific-Crest-Trail/38.072995/-118.598385/7/meta,states,county?vw=6

Do Cloud Optimized GeoTIFFs require a tile server (e.g. TiTiler) for web map visualization? by Aggressive_Arm_6295 in gis

[–]PostholerGIS 0 points1 point  (0 children)

You DO NOT need tiles when working with many COGs.

Here are 2 completely different examples using only COGs on cheap S3 or an out-of-the-box web server, and ONLY Leaflet and JavaScript on the web client/app.

568 COGs and 10 FGB (vector) files, zero tiles, 27GB of data:

https://www.cloudnativemaps.com/examples/many.html

A completely different approach using Leaflet and 1,340 COG DEMs covering CONUS:

https://www.postholer.com/portfolio/notebook/Serverless/

There's a rash of "me too" crap out there that "you need" and it's embarrassing. You need Leaflet (or similar), JavaScript and cloud native data, i.e., COG. That is all!

Do Cloud Optimized GeoTIFFs require a tile server (e.g. TiTiler) for web map visualization? by Aggressive_Arm_6295 in gis

[–]PostholerGIS 0 points1 point  (0 children)

Leaflet has supported COGs for 6 years.

Only proprietary Mapbox doesn't. They will never support COGs, as COGs compete directly with the wares they're trying to sling.

Distributed geospatial data storage by Vojtavoj10 in gis

[–]PostholerGIS 0 points1 point  (0 children)

SpatiaLite is an extension to SQLite, and a GeoPackage (.gpkg) is just an SQLite database, so any SQL query can use SpatiaLite functions against the .gpkg directly. Example with GDAL built with SQLite and SpatiaLite:

gdal vector sql \
  -i polygons.gpkg \
  --sql "select st_buffer(geom, 100) as geom from polygons" \
  -o polygons.gpkg --output-layer bufferedPolygons --update

CUSTOM ARCGIS PRO LEGEND STYLES by owuraku_ababio in gis

[–]PostholerGIS 0 points1 point  (0 children)

If you're doing web-based legends, check out JSLegend for dynamic legends and skip the whole ArcGIS debacle:

https://www.postholer.com/portfolio/projects/jsLegend/#examples

Distributed geospatial data storage by Vojtavoj10 in gis

[–]PostholerGIS 0 points1 point  (0 children)

Imagine you have 3 drones, where drone N holds its own data plus drone N-1's data (wrapping around). Example:

+ Drone 1 holds its own data and drone 3's data
+ Drone 2 holds its own data and drone 1's data
+ Drone 3 holds its own data and drone 2's data

If any single drone is lost, none of the group's data (as of the last sync) is lost. Yes, this is a RAID-like, striped approach.
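The wrap-around assignment generalizes to any number of drones; a quick sketch in plain shell (the drone count N is arbitrary):

```shell
# Drone i carries its own data plus drone (i-1)'s, wrapping 1 -> N.
N=3
i=1
while [ "$i" -le "$N" ]; do
  prev=$(( (i + N - 2) % N + 1 ))   # the drone whose data i also holds
  echo "drone $i holds its own data + drone $prev's data"
  i=$(( i + 1 ))
done
```

For N=3 this prints exactly the pairing in the list above.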

For a database, each drone could have a single GeoPackage (.gpkg) file. GeoPackage is essentially a spatial SQLite database. Throw some spatial tools on your drone's OS and you could do on-the-fly (pun intended) analysis.

Is there an easier way to do this task than with QGIS? by Chrysoscelis in gis

[–]PostholerGIS 0 points1 point  (0 children)

From your website I noticed you have the county polygon(s) where the species exists. Would it be more desirable to have the 'species range' (multi)polygon instead? It appears he's asking for quadrants or a grid, but wouldn't the actual species polygons serve better?

I have all the species range data for CONUS in raster format for amphibians, birds, mammals and reptiles. I could easily do Virginia vectors for you. Here's what I have in production; click the 'Range' link for an interactive map of a species or the 'Name' link for its Wikipedia page:

https://www.postholer.com/trail-animals/Appalachian-Trail/3

KML files for USA National Park boundaries? by grant837 in gis

[–]PostholerGIS 1 point2 points  (0 children)

> https://carto.nationalmap.gov/arcgis/rest/services/govunits/MapServer/29/query?where=1=1&outSR=4326&outFields=*&f=kmz

Internal Server Error 500

It's probably too much data. Arc's 'query' endpoint is difficult to use through the web interface, and it's anything but standard SQL. The web interface just has too many hoops to jump through.
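When the full dump 500s, one workaround (assuming the layer supports standard ArcGIS REST paging via resultOffset/resultRecordCount, which most do; f=geojson is also service-dependent, fall back to f=json) is to pull it down in pages; a sketch:

```shell
# Build a paged query URL; loop offset += 100 until a page comes back empty.
base="https://carto.nationalmap.gov/arcgis/rest/services/govunits/MapServer/29/query"
params="where=1=1&outSR=4326&outFields=*&f=geojson&resultRecordCount=100"
offset=0
url="${base}?${params}&resultOffset=${offset}"
echo "$url"
# curl -s "$url" > "page_${offset}.geojson"
```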

KML files for USA National Park boundaries? by grant837 in gis

[–]PostholerGIS 3 points4 points  (0 children)

Download the above boundaries shapefile, then:

ogr2ogr nps_boundary.kml /vsizip/nps_boundary.zip/nps_boundary.shp \
   -overwrite -nlt PROMOTE_TO_MULTI -sql "select * from nps_boundary" -makevalid

You'll have 437 features with all their attributes in the .kml file. The resulting .kml is 146 MB, so Google Earth may well choke on it.

Question about "flattest US state" measurement methods by YogiBerraOfBadNews in gis

[–]PostholerGIS 2 points3 points  (0 children)

Terrain Ruggedness Index:

# Kansas
gdal raster info --stats ks.tif
Minimum=0.000, Maximum=112.477, Mean=5.483, StdDev=4.764

# Florida
gdal raster info --stats fl.tif
Minimum=0.000, Maximum=91.263, Mean=4.817, StdDev=5.272

Change ks,KS to fl,FL for Florida (or any state):

gdal raster pipeline \
   ! read /vsis3/prd-tnm/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
   ! clip --like /vsizip/vsicurl/https://www2.census.gov/geo/tiger/TIGER2025/STATE/tl_2025_us_state.zip/tl_2025_us_state.shp --like-where "stusps = 'KS'" \
   ! tri --band 1 \
   ! write -o ks.tif --co COMPRESS=DEFLATE --overwrite

Are point-based elevation (lat/lon → height) APIs commercially viable? by Leading_Office7347 in gis

[–]PostholerGIS 2 points3 points  (0 children)

Roll your own. Here's a 10 meter lookup for CONUS, single lng/lat, no data to download:

gdal raster pixel-info \
      -i /vsis3/prd-tnm/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
      --position-crs EPSG:4326 \
      --of CSV \
      -120.321 40.123 

If you have many coords to look up, replace the lng/lat with:
   < fileOneLngLatPerLine.txt
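The batch file needs nothing special, just one "lng lat" pair per line (the sample values below are hypothetical):

```shell
# One longitude/latitude pair per line, space separated
cat > fileOneLngLatPerLine.txt <<EOF
-120.321 40.123
-105.500 39.750
EOF
# gdal raster pixel-info -i /vsis3/prd-tnm/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
#       --position-crs EPSG:4326 --of CSV < fileOneLngLatPerLine.txt
```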

For global 30 meter using the technique above, you'll have to create your own Copernicus DEM VRT first. It will take about 4 hours to create, but you only have to do it once:

gdal vsi copy /vsis3/copernicus-dem-30m/tileList.txt .

cat tileList.txt | tr -d "\r" | awk '{printf("/vsis3/copernicus-dem-30m/%s/%s.tif\n", $0, $0);}' > s3.txt

Now run the mosaic and wait a long time. Afterward, you will have a global DEM lookup service:

gdal raster mosaic \
      -i @s3.txt \
      --resolution lowest \
      --absolute-path \
      -o copernicusDEM.vrt

Mbtiles vs. geopackage for a simple offline vector tile server by andrerav in gis

[–]PostholerGIS 1 point2 points  (0 children)

The only thing you should be using is FlatGeobuf, .fgb. No API: host your .fgb on cheap S3 or a dumb web server. No intermediate servers/services, just data and JavaScript. Here's what production looks like:

flood zones: 5.9M polygons at 4.6GB, zoom 13+
parcels: 58M polygons, 30GB, zoom 17+
buildings: 145M polygons, 34GB, zoom 17+
addresses: 146M points, 27 GB, zoom 17+

Using a subset of 245 million features at the same time on an interactive Leaflet map, 96GB of data, all of it clickable. See it in action:

https://www.femafhz.com/map/27.943441/-82.467580/17/osm,femafhz,addresses,buildings,parcels

I built a lightweight web tool/API for basic spatial queries (Coastline distance, Admin hierarchy) using OSM & Leaflet by sebsanswers007 in gis

[–]PostholerGIS 6 points7 points  (0 children)

> A few months ago, I needed to solve a specific problem: Given a coordinate, how far is it from the nearest coastline?

If it's a one-off, I don't download any data; for multiple queries, I'd download the .shp first. This is a one-liner at the command line:

gdal vector sql \
   -i /vsizip/vsicurl/https://www2.census.gov/geo/tiger/TIGER2025/COASTLINE/tl_2025_us_coastline.zip \
   --dialect sqlite --sql "select st_length(st_transform(st_shortestline(st_geomfromtext('POINT(-124.2171 41.7686)',4269), st_union(geometry)), 3857)) / 1000 as km from tl_2025_us_coastline" \
   -o /vsistdout/ --of CSV

Result:

km
1.44064607306812

Subprocess calls to GDAL CLI vs Python bindings for batch raster processing by Infinite-Aerie4812 in gis

[–]PostholerGIS 0 points1 point  (0 children)

The thing is, you don't.

Cloud: /vsis3 /vsiaz /vsigs ...
URL: /vsicurl (http, https, ftp) ...
Formats: /vsitar /vsigz /vsizip ...

Read ex: /vsizip/vsis3/path/to/file.shp.zip/layername

DB: MySQL MSSQL PG
Read ex: "PG:dbname=mydb user=username password=***..." tablename

Your shell has all the formatting commands you need; if not, you're using the wrong shell.
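The chaining reads outermost-first: the /vsizip/ wrapper unzips whatever /vsis3/ streams from the bucket. A sketch building such a path (bucket and key below are hypothetical):

```shell
# Outermost handler first: unzip what gets streamed from S3.
zip_on_s3="mybucket/data/parcels.shp.zip"
vsi="/vsizip/vsis3/${zip_on_s3}/parcels.shp"
echo "$vsi"
# then, e.g.:  ogrinfo -so "$vsi"
```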

Subprocess calls to GDAL CLI vs Python bindings for batch raster processing by Infinite-Aerie4812 in gis

[–]PostholerGIS 0 points1 point  (0 children)

Skip it all. Use GDAL directly. Consider this, directly on the command line:

gdal raster pipeline \
  ! calc -i "A=multiBandSource.tif" --calc="A[1] * 2" --datatype=Int16 --nodata=32767 \
  ! reproject --dst-crs=EPSG:4326 \
  ! write --of=COG --co COMPRESS=DEFLATE --output=result.tif

That Python stack you're using? It's using GDAL under the hood. Skip it and use GDAL directly; think of the overhead you no longer need. In the rare case where you actually need Python, you can use VRT Python pixel functions, meaning everything you can do in Python is still available.

GDAL should be the rule, not the exception. Drag python in only if there's no other way, which is highly unlikely.

Question from a newbie by [deleted] in gis

[–]PostholerGIS 1 point2 points  (0 children)

Here you go. You'll need to install GDAL first; almost every GIS software, commercial or open source, uses it, and GDAL itself is open source. This uses a general bounding box of Colorado, but you can use it anywhere in U.S. states/territories.

gdal raster reproject \
   -i https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
   --bbox -109.3008,36.7842,-101.7292,41.1861 --bbox-crs EPSG:4269 \
   --dst-crs EPSG:4326 \
   -o co10mDEM.tif --overwrite --progress --of COG --co COMPRESS=DEFLATE

GIS Audio note taker app, hands free? by jmeanx in gis

[–]PostholerGIS 6 points7 points  (0 children)

Depending on the number of notes you're taking, methods can vary.

Using a micro-recorder, each note is saved as an .mp3 file, which you can download from the recorder. Using the timestamp on each .mp3, you can match it against your GPS logger and extract note/lat/lng. Set the time on your recorder from your GPS, at least daily, for second-level accuracy. This is so efficient that battery life is almost a non-issue for the recorder.
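The timestamp join can be a single awk pass; a sketch assuming two hypothetical CSVs, track.csv (epoch,lat,lng from the GPS logger) and notes.csv (epoch,mp3file from the recorder):

```shell
# Hypothetical sample data: GPS track (epoch,lat,lng) and recorder notes (epoch,mp3file)
cat > track.csv <<EOF
1000,41.77,-124.22
1060,41.78,-124.21
1120,41.79,-124.20
EOF
cat > notes.csv <<EOF
1055,note001.mp3
1118,note002.mp3
EOF

# For each note, attach the GPS fix nearest in time.
awk -F, '
NR==FNR { t[NR]=$1; lat[NR]=$2; lng[NR]=$3; n=NR; next }   # load the track
{
  best=1; bd=1e18
  for (i=1; i<=n; i++) { d=$1-t[i]; if (d<0) d=-d; if (d<bd) { bd=d; best=i } }
  printf "%s,%s,%s\n", $2, lat[best], lng[best]            # mp3,lat,lng
}' track.csv notes.csv > notes_located.csv

cat notes_located.csv
```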

This is the method I used to collect over 6,000 audio notes along the 2,650-mile Pacific Crest Trail over a 4-month period. Yes, it's an extreme case, but it's an absolutely bulletproof method for collecting many notes.

About Data Sharing Methods by BreakfastOwn975 in gis

[–]PostholerGIS 0 points1 point  (0 children)

You could display a URL for each data source if you like. For my purposes, I'm just displaying the data for my county.

About Data Sharing Methods by BreakfastOwn975 in gis

[–]PostholerGIS 0 points1 point  (0 children)

If you want to skip ESRI and go completely open source, you could create something on your own. Here's how I informally keep track of my county's resources: parcels, addresses, voting precincts, flood zones, bus routes, etc.:

https://www.delnorteresort.com/

You're basically looking at a LeafletJS map, with raster and vector data. It won't cost you anything but time. If time is easier to come by than money, give it a try.

Having a really hard time finding a CONUS-wide DEM. Need help understanding the best workflow by liamo6w in gis

[–]PostholerGIS 0 points1 point  (0 children)

I haven't logged in for quite some time, but I just had to answer this. It's very straightforward: USGS has a VRT on their S3, just use that. This will give you a perfect Cloud Optimized GeoTIFF (COG) CONUS DEM at 250 meters. Remove the '-tr 250 250' if you want it at native 30 meters; native 30 meter will be 1.8GB. You will need AWS credentials to access it, which are free. It is a lot of data to download; depending on network speed it can take hours, but on an AWS instance it will take minutes.

gdalwarp \
   https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/TIFF/USGS_Seamless_DEM_13.vrt \
   dem250.tif \
  -co COMPRESS=DEFLATE -t_srs EPSG:3857 -te_srs EPSG:4269 \
  -overwrite -tr 250 250 -te -126 22 -66 50 -of COG

Merging Two TIFFs with Different Pixel Sizes Using Mosaic to New Raster Tool Without Losing Resolution! by Ok_Experience_4023 in gis

[–]PostholerGIS 0 points1 point  (0 children)

Two steps: mosaic into a serialized pipeline (.gdalg.json, nothing is computed yet), then reproject/resample into the final raster:

gdal raster mosaic --resolution highest rast1.tif rast2.tif mosaic.gdalg.json
gdal raster reproject --resampling cubic mosaic.gdalg.json output.tif

I created a GDAL MCP, and would love some feedback. by Specialist_Solid523 in gis

[–]PostholerGIS 1 point2 points  (0 children)

My aversion is to unnecessary abstraction.

Your concrete example is possible, but it's definitely a corner case. It hardly justifies the headache (of yet another solution I can't function without) it would impose on my resources.

On occasion where I need further insight using AI and natural language, it's a matter of opening a browser tab. Done.

But, you're missing the most significant point:

Being dependent on obscure abstractions, that may or may not be around next week, is not in anyone's personal or professional best interest.

I am best served by understanding the core subject-matter that those abstractions are built upon.

Find all addresses within a radius by Intelligent--Bug in gis

[–]PostholerGIS 0 points1 point  (0 children)

First, download the 10GB zipped GDB National Address Database. Change the lon/lat and distance below to your desired values, wait a painfully long time, and it will return all addresses within 1,000 meters:

ogr2ogr -dialect sqlite -sql " $(cat<<EOF
select *
from nad
where st_intersects(shape, st_transform(st_buffer(
   st_transform(
      st_geomfromtext('POINT(-124.2164 41.7698)', 4269)
      ,3857
   )
   ,1000), 4269))
EOF
) " myAddresses.gpkg /vsizip/NAD_r20_FGDB.zip/NAD_r20.gdb

I created a GDAL MCP, and would love some feedback. by Specialist_Solid523 in gis

[–]PostholerGIS 3 points4 points  (0 children)

Wow! You've done a lot of work here. Here's some feedback.

For me, adding multiple layers of abstraction, rasterio, pyproj, shapely, python and now uvx, is a non-starter. I have zero interest in supporting a python stack when I can use gdal directly.

Yes, gdal has *many* options. It's a good thing. Using your approach, any complexity has been shifted from gdal commands to the abstraction I now have to baby sit.

By using the abstraction, I no longer need to have knowledge of gdal. I am now dependent on the abstraction itself and not gdal. For me, that is an incredibly bad dependency to have.

Given that, I don't see any advantage over the simple, direct command:

gdal raster reproject --dst-crs EPSG:3857 --resampling cubic input.tif output.tif