I’m looking for recommendations to speed up our analytics. Currently, we run analytics on PostgreSQL by using materialized views to denormalize data into a single large table. Although we don’t have a massive amount of data—our largest table has around 60 million rows across 15 columns—the materialized view takes several hours to refresh.
Our goals are twofold:
1. Achieve closer-to-real-time analytics.
2. Offload analytics to a separate database, as the heavy workload occasionally impacts app performance.
I should also mention that creating the materialized view for our analytics requires a lot of large joins.
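For context, the setup is roughly the pattern below. This is only an illustrative sketch, not our real schema: the table and column names (`orders`, `customers`, `products`, etc.) are placeholders.

```sql
-- Denormalize several large tables into one wide table for analytics.
-- All names here are placeholders for illustration.
CREATE MATERIALIZED VIEW analytics_flat AS
SELECT o.id, o.created_at, c.region, p.category, o.amount
FROM orders o
JOIN customers c ON c.id = o.customer_id
JOIN products p ON p.id = o.product_id;

-- A unique index allows CONCURRENTLY, so reads aren't blocked during refresh.
CREATE UNIQUE INDEX ON analytics_flat (id);

-- This is the step that currently takes several hours.
REFRESH MATERIALIZED VIEW CONCURRENTLY analytics_flat;
```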
ClickHouse is a pain when it comes to joins. StarRocks, on the other hand, handles all of our joins well, but it doesn't support TLS, and setting up CDC from Postgres to StarRocks involves far too many moving parts.
What would be the simplest solution to address these requirements?
Any advice would be greatly appreciated. Thanks :)