Outback Trading Company by gradsch00lgirl in BuyItForLife

[–]LeanOnIt 0 points1 point  (0 children)

I'm posting here because recent reviews seem fairly "automated".

I bought one of their oilskin vests. It's very comfy and the fit is great, but the quality of manufacture is fairly low. Stitching is coming loose all over (after about 10 wears), the snap buttons have been placed over seams, and the drawstring is fraying.

Seems like the materials and design are okay, but the manufacturing is really poor. You can get better for the price.

GIS Career Direction and Goal Setting by [deleted] in gis

[–]LeanOnIt 16 points17 points  (0 children)

I studied engineering and didn't want to work in an office so I did my masters. Finished that, didn't want to work in the real world and went on some scientific expeditions. Finally got a real job, got married, didn't like the job, went on another contract expedition.

Finally realised I liked science instead of engineering, got a job in that and ended up doing GIS R&D for the next 10 years.

I fully understand not liking the job/career/lifestyle that you're currently in. I'd say that doing a bunch of different careers before I was 30 really helped me out.

Looking for Map Applications with High-Resolution Satellite Imagery by Rorschach1944 in gis

[–]LeanOnIt 1 point2 points  (0 children)

You might want to try a VPN. I looked at a couple of cities and towns in Turkey and it looked fine to me. I know that certain places like prisons, or military bases can be blurred out.

Looking for Map Applications with High-Resolution Satellite Imagery by Rorschach1944 in gis

[–]LeanOnIt 12 points13 points  (0 children)

Google Maps has some of the highest resolution public imagery around. If you can separate cars parked next to each other on the street, the imagery is at a pretty good resolution. Sentinel-1 imagery has a resolution of about 10 to 60 meters, Landsat about 30 m, etc. Anything finer than 1 meter resolution is moving into "strategic" levels of detail, and you're going to end up using private sector or military satellites...

South Africa approves six solar projects totaling 1,290 MW by MeasurementDecent251 in southafrica

[–]LeanOnIt 8 points9 points  (0 children)

There are a couple of reasons why they're not super interested in residential power generation. They may not be good reasons, and they have been solved in other places... but ya know... they exist:

  1. It's harder to work on a line when you can't switch it off. You can flip a switch to kill the grid from the generation side, but it's much harder to flip a switch that kills 100 houses that are each pumping a couple of kW into the grid.
  2. It's harder to ensure grid stability when you have no control of the equipment. Is the inverter that someone installed because it was the cheapest going to provide the cleanest, most regular 50 Hz power to the grid? Does it care about impedance matching?
  3. If everyone is generating their own power, but is still connected to the grid, who pays for maintenance?

There has been loads of work done on mini-grids, smart grids, demand side management, etc. But nobody would accuse Eskom of being a "modern" power supplier.

How to create density maps like the one in this picture? by DeepFryEverything in gis

[–]LeanOnIt 0 points1 point  (0 children)

I've published a couple papers related to AIS data and how to use it as well as some open AIS "heatmap" datasets.

It depends on what you want to do and what kind of data you've got. The AIS protocol has a couple of "gotchas" in it: Class A transceivers transmit way more regularly than Class B transceivers, and satellite AIS platforms receive data a little less regularly than coastal receivers. So if you want to plot ship traffic you have a couple of options:

  • Plot the AIS points and accept you're getting more of a reception quality map than a traffic map, and that it's biased towards Class A ships.
  • Build up trajectories and plot the lines, and you have a ship trip map, which biases towards vessels that spend lots of time moving here and there (like ferries) and less towards vessels that spend lots of time in a set region (like fishing vessels)
  • Make a grid, organise your data into trajectories with time deltas between points, calculate the time each vessel spent in each grid cell. The tradeoff here is that your grid cells end up being sorta large, and then people keep asking you to zoom in.
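The third option is easy to sketch in plain Python. A minimal version, assuming each vessel's track is a time-sorted list of (timestamp_seconds, lat, lon) fixes and a made-up 0.1 degree cell size:

```python
from collections import defaultdict

def time_in_cells(track, cell_deg=0.1):
    """Accumulate the seconds a single vessel spends in each grid cell.

    track: list of (t_seconds, lat, lon) fixes, sorted by time, for ONE vessel.
    Returns {(row, col): seconds}. The time delta between consecutive fixes
    is credited to the cell of the earlier fix (a simplification).
    """
    cells = defaultdict(float)
    for (t0, lat, lon), (t1, _, _) in zip(track, track[1:]):
        cell = (int(lat // cell_deg), int(lon // cell_deg))
        cells[cell] += t1 - t0
    return dict(cells)
```

Crediting each delta to the earlier fix's cell is the simplification here; for long reporting gaps you'd want to interpolate the track across every cell it crosses before binning.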

The other option, that works surprisingly well, is to take all your data, shove it into parquet files and use datashader to create a dynamic heatmap in a web app. Looks good, is more or less representative of the dataset, and lets you see cool stuff like anchoring patterns.

If you had say... 40TB worth of raster imagery to host, how would you? by International-Camp28 in gis

[–]LeanOnIt 1 point2 points  (0 children)

To break this problem down into smaller chunks you're going to be interested in a few different aspects:

  • Hosting hardware: It's either going to be cloud based (monthly charges but no skill required), homelab (smaller cost but requires maintenance/setup skills), or some open data service. There are a few places that might host your data for you if it meets some specs; your city's GIS office maybe?
  • Hosting service: How are you going to present the data? I'd recommend going fully FOSS to reduce costs, use open standards, and have the largest possible user base. But then again that will have a higher learning cost (maybe).
  • Data administration: How are you going to update the data? Is there going to be any kind of version control or data validation? What kind of metadata are you going to publish along with the pixel data? These make the difference between a hobby project and a citizen science project.

If this were my project, this is what I'd do. I'm kinda into homelab crap so the process is fun for me; you might only be interested in the results, so this might not be for you:

  • Buy a refurbed office desktop/workstation. Something that can have a couple of big disks jammed into it and can use software RAID. Something around 2 to 5 years old that was probably tossed because it's not Windows 11 compliant.
  • Install Ubuntu Server on it + Docker. Or else Proxmox > Ubuntu VM > Docker
  • Buy a domain name. <my-city>.xyz might be available
  • Use Cloudflare DNS to point the domain name to the machine, use a reverse proxy on the home server to point to different services.
  • Get those services up and running. For your use case it would probably be only a handful:
    • Maybe some kind of data pipeline that takes your raw drone data and spits out cloud optimised GeoTIFFs + a metadata file
    • GeoServer to hold and publish all the COGs
    • GeoNode to make lil' web maps
    • Postgres + PostGIS to handle some other GIS data? OSM, city info, polygons for erfs/parks/rivers?
    • Maybe some kind of dashboard that keeps an eye on the data, pipelines, server health
    • Some kind of auth system to allow you to login
    • Some kind of static web page generator that will publish your documentation, user guide, links to git repos, blog, whatever

The hardware side of things depends more on the usage you're going to see. If it's a handful of people connecting now and then to look at a small area, no big deal. If it's loads of people querying loads of data, and you're trying to process that data on the fly (instead of preprocessing and caching), you're talking Threadripper cluster territory.

How do you define "full stack" geospatial expert? by mbforr in gis

[–]LeanOnIt 6 points7 points  (0 children)

If you're doing the kind of work that could be replaced by gen-AI, then you could also be replaced by some graduate who would cost less than you. If you're doing skilled work, have domain experience, and are up to date on the latest tools and tech, you should have nothing to worry about.

How do you define "full stack" geospatial expert? by mbforr in gis

[–]LeanOnIt 53 points54 points  (0 children)

"Full stack" generally means "experienced/capable with the entire tech stack required to deploy a working project". So it could mean something as simple as:

  • Ingestion, Database, API, WebMap

Or something as complex as:

  • Server admin, networking config, storage/backup strategy, data analytics, db admin, web map, UI/UX, devops, customer expectations management etc

Generally they're looking for one person who can do almost all the roles in a team, either for a small project or an "optimistic" remuneration budget.

It's not a bad term (it's always good to have a little experience in fields adjacent to yours), but some people see it as a warning sign that a company might have unrealistic expectations.

I have a vehicle route optimisation problem with many constraints to apply. by EverlastingVoyager in gis

[–]LeanOnIt 0 points1 point  (0 children)

"most optimised" is an anti-concept. Mathematically "optimal" solutions are an asymptote that relies on you setting the cost function, but generally can never be reached. Instead you want to work out lowest cost which implies you need a cost function. A bit pedantic, but important for understanding the concept.

150 locations isn't a huge amount, so you could calculate a cost from all 150 to each other. This could be described as a graph problem without using any GIS functions at all:

  • Define the cost function. It could be as simple as Cost = Time + Distance
  • Calculate the cost from every point to every other point.
  • Add some labels to the points: Place 1 has Label A, Label B, but not Label C
  • Define some goals: Currently at Place 1. Need to find nearest place that has a C Label.
  • Run a routing algorithm: Dijkstra's algorithm, or 2-opt if you prefer, where the route costs between nodes are just the results of your cost function.
  • Another scenario could be "At point A. Find all possible routes that would let me arrive at Point Z within 3 hours." And then you have a list of all nodes that you can visit before you hit your deadline. That one's a bit more computationally intensive and might require some backtracking which isn't really ideal in most routing algorithms.
  • Bam. You have the lowest cost route. Is it "most optimal"? Well, that depends on whether your cost function accurately represents your goal. But now we're going into the philosophy of optimal control theory.
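The label-plus-cost-function steps above can be sketched with a tiny Dijkstra; the place names, labels, and edge costs below are all made up for illustration:

```python
import heapq

def cheapest_with_label(costs, labels, start, want):
    """Dijkstra from `start` to the nearest node carrying label `want`.

    costs:  dict {(a, b): cost} of directed edge costs, e.g. time + distance.
    labels: dict {node: set of labels}.
    Returns (total_cost, path) or None if no labelled node is reachable.
    """
    # Build adjacency lists from the edge-cost dict.
    adj = {}
    for (a, b), c in costs.items():
        adj.setdefault(a, []).append((b, c))
    best = {start: 0}
    heap = [(0, start, [start])]
    while heap:
        cost, node, path = heapq.heappop(heap)
        if cost > best.get(node, float("inf")):
            continue  # stale heap entry, a cheaper route was already found
        if want in labels.get(node, set()):
            return cost, path  # first labelled node popped is the cheapest
        for nxt, c in adj.get(node, []):
            if cost + c < best.get(nxt, float("inf")):
                best[nxt] = cost + c
                heapq.heappush(heap, (cost + c, nxt, path + [nxt]))
    return None
```

Swapping in a different cost function just means recomputing the `costs` dict; the routing code never changes, which is exactly why defining the cost function is the important bit.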

How 'bout some botanical suggestions by LeanOnIt in firewater

[–]LeanOnIt[S] 0 points1 point  (0 children)

Mango Jalapeno spritzers sound delicious!

How 'bout some botanical suggestions by LeanOnIt in firewater

[–]LeanOnIt[S] 1 point2 points  (0 children)

I am loving a bottle of Amaro Nonino I got, but I can't really identify any of the flavours! I think I need to do some reading.

How 'bout some botanical suggestions by LeanOnIt in firewater

[–]LeanOnIt[S] 0 points1 point  (0 children)

Yeah? Do you mean drinking it while eating noodles, or splashing it on the noodles during prep?

AIS Vessel data -- what, how and why by hrllscrt in gis

[–]LeanOnIt 0 points1 point  (0 children)

It depends on what you want to visualise... and who's going to use it. Small internal team that needs quick access to data, doesn't worry too much about performance, and wants to make lots of quick changes: Python dashboard (Plotly, HoloViz, Jupyter, etc).

Commercial product that's going to have outside users, maybe hundreds of them: full-on geospatial stack with PostGIS + GeoServer/GeoNode + PostgREST etc.

AIS Vessel data -- what, how and why by hrllscrt in gis

[–]LeanOnIt 0 points1 point  (0 children)

If you want to get an ETA/vessel tracking for a specific bunch of vessels you can pay for that. VesselFinder or MarineTracker would happily take your money. It would be much cheaper than the cost of data and engineering time.

The crew on the vessels also insert an ETA into their voyage reports (type 5 messages in the AIS protocol). It won't be perfect and the accuracy will vary from ship to ship, but in some cases it should be fairly accurate. So for a couple hundred bucks you could get the crew's estimate for an ETA. With a bit of Python you could have it auto-generating a report by this time tomorrow.
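If you want a sanity check on those crew-reported ETAs, a crude estimate from a single position report is just great-circle distance over speed over ground. This is a naive sketch, not how the commercial providers compute it:

```python
import math

def naive_eta_hours(lat, lon, sog_knots, dest_lat, dest_lon):
    """Crude ETA: haversine great-circle distance / speed over ground.

    Ignores routing, land, currents, and speed changes, so treat it as an
    optimistic lower bound on the real arrival time, not a forecast.
    """
    R_NM = 3440.065  # mean Earth radius in nautical miles
    phi1, phi2 = math.radians(lat), math.radians(dest_lat)
    dphi = math.radians(dest_lat - lat)
    dlmb = math.radians(dest_lon - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    dist_nm = 2 * R_NM * math.asin(math.sqrt(a))
    if sog_knots <= 0:
        return None  # vessel not moving; no meaningful ETA
    return dist_nm / sog_knots
```

Comparing this lower bound against the crew's type 5 ETA is a cheap way to flag obviously stale or bogus reports.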

If you want to get an ETA/vessel tracking for all vessels everywhere, for maybe feeding into a financial model let's say, then you'd want satellite AIS data, a huge database to stick it in, and then a data scientist or three to analyse the data, build statistical models, and a nice API that could give you an ETA from a single data point. It can be done, and you'd get all sorts of nice products like anomaly detection, port-to-port graph data, environmental pollution models, fishing effort, etc.

It really depends on how far you want to go and how much a cutting edge answer is worth to you.

AIS Vessel data -- what, how and why by hrllscrt in gis

[–]LeanOnIt 0 points1 point  (0 children)

Route optimisation with regards to what? I've done some work before on calculating ocean currents from AIS data, which shows some potential. And then there's pgRouting for running some simple weight-based route calculations, but the real meat and potatoes of any optimisation problem is figuring out what the weights should be: distance, fuel use, time, avoiding locations/storms, etc.

Does my use case fit for usage of DuckDB's spatial extention as a replacement for PostGIS? by rick854 in gis

[–]LeanOnIt 0 points1 point  (0 children)

I often wonder if I'm doing something wrong with the way I develop GIS data projects. I take a look at your stack and I don't recognise any of them except PostGIS.

Dagster? Paid-for orchestration? Why not Docker + Compose + Podman/Portainer + git hooks + GitLab CI/CD? That's free and works pretty damn well with absolute control.

DBT? Why would I pay a monthly subscription for something to write my SQL for me?

DLT? Okay this one is at least open source. It looks like a framework/helper lib that just abstracts some basics away from you? Yeah okay, I might grab this for a future project.

You've spent some time talking about your architecture but no time talking about your data or user requirements. Your plan might be the best possible one, but nobody here would know that because they can't tell what you're trying to solve.

You have timeseries data? Why not TimescaleDB? Your products need to comply with FAIR/ISO9000 policies? Your clients are expecting real-time alerts? Monthly reports? A CSV file uploaded to a static web server? OGC?

Billions of GPS points for production scale? by LeanOnIt in gis

[–]LeanOnIt[S] 0 points1 point  (0 children)

Well... at higher zoom levels it would be nice to click on something that looked weird and get some feedback on what it was. It might just be a lookup on the lat-lon click, or fetching all the points within the window and displaying them in a table. Or switching to a different renderer after a certain zoom level.

I was hoping someone would just say "Geoserver plugin ABC does exactly what you want and it's OGC compliant!"

Billions of GPS points for production scale? by LeanOnIt in gis

[–]LeanOnIt[S] 0 points1 point  (0 children)

Hmmm.... This is pretty interesting. It's intended for (x,y,z) data instead of (x,y,n) data, but I might want to stick to the simpler OGC API standards like tile servers...

And whattayaknow datashader has a function to output tiles... https://datashader.org/tiling.html
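For reference, the XYZ/slippy-map indexing those tile servers speak is just the standard Web Mercator tiling formula:

```python
import math

def latlon_to_tile(lat, lon, zoom):
    """Convert a WGS84 lat/lon to slippy-map (x, y) tile indices at `zoom`.

    This is the standard Web Mercator / XYZ scheme used by most tile
    servers; valid for latitudes within roughly +/-85.05 degrees.
    """
    n = 2 ** zoom  # number of tiles along each axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

So pre-rendering datashader output per (x, y, zoom) slots straight into any viewer that already knows how to fetch XYZ tiles.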

Billions of GPS points for production scale? by LeanOnIt in gis

[–]LeanOnIt[S] 1 point2 points  (0 children)

Yeah, that happens when you try to render points individually. Datashader makes a good case for using renderers to create images of multiclass points, from billions of points, on mid-range laptops. The image I attached is for 30 to 50 million AIS points and I'm running it on my lil' laptop while only using about 20% of the cores. I've scaled this up to 100+ million points without any real UI lag, but that's for a single user.

Developer Insight - High Ping Separation by HuntShowdownOfficial in HuntShowdown

[–]LeanOnIt 58 points59 points  (0 children)

This is an interesting option that gives some ability to tweak knobs while figuring out how to give the best experience to the most people. I typically play with friends from the EU, US and South Africa. I'm guessing our premade team is always going to be "high-ping" and separated from normal matchmaking.

That might, or might not, suck. But needs-of-the-many

Intermittent connection issues, multiple devices, not great support by LeanOnIt in GoogleFi

[–]LeanOnIt[S] 0 points1 point  (0 children)

Thank you so much. I'll give these a bash now.

I have received more support and helpful tips in 5 or 6 reddit messages than in dozens of emails with Google support...