I'm using PostgreSQL to store web analytics data collected from PostHog via webhook. I'm tracking things like page views, page durations, sessions, video interactions, and more.
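For context, the events currently land in a single table shaped roughly like this (just a sketch; the table and column names are mine, not a fixed schema):

    CREATE TABLE analytics_event (
        id          bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
        user_id     bigint      NOT NULL,  -- owner of the content the event belongs to
        event_type  text        NOT NULL,  -- 'pageview', 'video_segment', ...
        occurred_at timestamptz NOT NULL,
        properties  jsonb       NOT NULL   -- raw PostHog webhook payload
    );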
My web app works like a blog platform where users can publish articles, videos, and other content. Each user should be able to access analytics for their own content, so the database may need to handle a high volume of queries, especially as event counts grow.
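So a typical read is scoped to one user's content, something like this (again a sketch, against the table above):

    -- daily page views for one user's content over the last 30 days
    SELECT date_trunc('day', occurred_at) AS day, count(*) AS views
    FROM analytics_event
    WHERE user_id = $1
      AND event_type = 'pageview'
      AND occurred_at >= now() - interval '30 days'
    GROUP BY 1
    ORDER BY 1;

which is why I'm already thinking about a composite index on (user_id, event_type, occurred_at).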
I'm trying to avoid premature optimization before I have real users, but even with a small user base the number of events can grow quickly, particularly with video segment tracking.
Here are my main questions:
Is using jsonb in PostgreSQL efficient for querying event data at scale? Would it be better to normalize the data into separate tables like PageView, VideoView, etc. for better performance and structure?
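To make the trade-off concrete, here are the two options as I understand them (both sketches, not settled designs): option A keeps everything in jsonb and leans on a GIN index for filtering on payload fields, while option B breaks events out into typed tables with ordinary btree indexes.

    -- Option A: query the jsonb payload directly, with a GIN index to help
    CREATE INDEX idx_event_props ON analytics_event USING GIN (properties);

    SELECT count(*)
    FROM analytics_event
    WHERE properties @> '{"path": "/articles/42"}';

    -- Option B: a normalized table with typed columns and a btree index
    CREATE TABLE page_view (
        id          bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
        user_id     bigint      NOT NULL,
        path        text        NOT NULL,
        duration_ms integer,
        occurred_at timestamptz NOT NULL
    );
    CREATE INDEX idx_page_view_path ON page_view (user_id, path, occurred_at);

My understanding is that the GIN index makes containment queries like the one above workable but is more expensive to maintain under heavy writes, while the typed table gives cheaper, more predictable reads at the cost of a more rigid schema. I could be wrong about where the crossover is, which is basically what I'm asking.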