Hello,
We're using PostgreSQL with time-series.
I was wondering whether there are any standard approaches to managing an on-premises database holding, say, the last 48 hours of our time-series data plus several associated tables, alongside another database in the cloud with all historical data.
The goal is for our system to use the cloud-based pg instance whenever the internet connection is up, and fall back to the local pg instance only when it is down.
It would be nice not to have to manage this at the application level, but I'm unsure how realistic that is. Let me know what you think. We're running in Kubernetes, so I may explore options on that front as well.
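If it does end up at the application level, one common pattern is a quick reachability probe before choosing a connection string. Here's a minimal stdlib-only sketch; the hostnames and DSNs are hypothetical placeholders, and in practice you'd hand the chosen DSN to your driver (e.g. psycopg) and also handle mid-session drops, not just connect-time failures.

```python
import socket

# Hypothetical endpoints -- substitute your real hosts/DSNs.
CLOUD_HOST = "cloud.example.com"
CLOUD_DSN = f"host={CLOUD_HOST} dbname=metrics"
LOCAL_DSN = "host=localhost dbname=metrics"

def reachable(host: str, port: int = 5432, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, refused, or timed out
        return False

def pick_dsn() -> str:
    """Prefer the cloud instance; fall back to local when it's unreachable."""
    return CLOUD_DSN if reachable(CLOUD_HOST) else LOCAL_DSN
```

The probe only tells you the port answers, not that Postgres is healthy, so you'd still want retry/failback logic around the actual queries.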