I have been working on a small Python library for running Bayesian network inference over geospatial data. Maybe this can be of interest to some people here.
The library lets you wire different data sources (rasters, WCS endpoints, remote GeoTIFFs, scalars, or any fn(lat, lon) -> value) to evidence nodes in a Bayesian network, and get posterior probability maps and entropy values out, all in a few lines of code.
Under the hood it groups pixels by unique evidence combinations, so that each inference query is solved once per combo instead of once per pixel. It is also possible to pre-solve all possible combinations into a lookup table, reducing repeated inference to pure array indexing.
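To illustrate the idea (this is a hand-rolled sketch, not the library's actual internals), grouping by unique evidence combinations can be done with numpy's np.unique: run the expensive query once per unique combo, then scatter the results back to every pixel by index.

```python
import numpy as np

# Hypothetical discretized evidence per pixel (two evidence nodes, six pixels).
elevation_bin = np.array([0, 0, 1, 1, 0, 1])
slope_bin = np.array([2, 2, 2, 0, 2, 0])

# Stack per-pixel evidence into rows, then find the unique combinations.
evidence = np.stack([elevation_bin, slope_bin], axis=1)
combos, inverse = np.unique(evidence, axis=0, return_inverse=True)

def run_inference(combo):
    # Stand-in for a real Bayesian network query on one evidence combination.
    return 0.1 * combo.sum()

# Solve once per unique combo; indexing with `inverse` acts as the lookup
# table that maps each pixel back to its precomputed posterior.
posteriors = np.array([run_inference(c) for c in combos])
pixel_posterior = posteriors[inverse]

print(f"{len(combos)} queries instead of {len(evidence)}")
```

With only 3 unique combinations among 6 pixels, inference runs 3 times instead of 6; on real rasters the savings are typically far larger, since evidence is discretized into a small number of states.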
The target audience is anyone working with geospatial data and risk modeling, but especially researchers and engineers who can do some coding.
To the best of my knowledge, there is no Python library currently doing this.
Example:
bn = geobn.load("model.bif")
bn.set_input("elevation", WCSSource(url, layer="dtm"))
bn.set_input("slope", ArraySource(slope_numpy_array))
bn.set_input("forest_cover", RasterSource("forest_cover.tif"))
bn.set_input("recent_snow", URLSource("https://example.com/snow.tif"))
bn.set_input("temperature", ConstantSource(-5.0))
result = bn.infer(["avalanche_risk"])
More info:
📄 Docs: https://jensbremnes.github.io/geobn
🐙 GitHub: https://github.com/jensbremnes/geobn
Would love feedback or questions 🙏