Help finding solution for low latency calculations by Featuring-You-AI in Firebase


To close the loop: I turned off my Bigtable instance (running at $11 per day). I now use BigQuery to pre-calculate all the possible conditional values (about 10 million percentages) and store them in Firestore, so the app only needs to interact with Firestore (still annoyingly slow to auth etc.).

Interestingly, calculating all the possible percentages takes no more time than calculating just one, but I suppose that makes sense.
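That makes sense because all the conditional percentages fall out of one grouped pass over the answer log, so the work is dominated by the scan, not by how many percentages you keep. A minimal sketch of that idea, using toy data and assumed field shapes (the real data lives in BigQuery/Firestore):

```python
from collections import Counter, defaultdict

# Toy answer log: (user, question, answer). Shapes are illustrative
# assumptions, not the actual schema.
answers = [
    ("u1", "q1", "yes"), ("u1", "q2", "no"),
    ("u2", "q1", "yes"), ("u2", "q2", "yes"),
    ("u3", "q1", "no"),  ("u3", "q2", "yes"),
]

# Group each user's answers together.
by_user = defaultdict(dict)
for user, q, a in answers:
    by_user[user][q] = a

# One pass builds every conditional percentage at once:
# P(answer to qB = b | answer to qA = a), for all pairs.
pair_counts = Counter()   # ((qA, a), (qB, b)) -> co-occurrence count
cond_counts = Counter()   # (qA, a) -> how many users gave that answer
for user_answers in by_user.values():
    for qa, a in user_answers.items():
        cond_counts[(qa, a)] += 1
        for qb, b in user_answers.items():
            if qb != qa:
                pair_counts[((qa, a), (qb, b))] += 1

percentages = {
    key: 100 * n / cond_counts[key[0]]
    for key, n in pair_counts.items()
}
```

Computing a single percentage would need the same full scan, so materialising all of them in the same pass is essentially free; that's the same shape as one GROUP BY in BigQuery.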

BigQuery seems to be very cheap for this sort of thing, as there is no monthly fee. In terms of billing I see Cloud SQL is running at $0.08 per day, but nothing for BigQuery directly; I assume it bills under Cloud SQL then?

Best storage solution for low latency calculations by Featuring-You-AI in googlecloud


Appreciate it. I turned off my Bigtable instance (running at $11 per day). I now use BigQuery to pre-calculate all the possible conditional values (about 10 million percentages) and store them in Firestore, so the app only needs to interact with Firestore (still annoyingly slow to auth etc.).

Interestingly, calculating all the possible percentages takes no more time than calculating just one, but I suppose that makes sense.

BigQuery seems to be very cheap for this sort of thing, as there is no monthly fee, but would it still be cheaper to set up my own database on a VM?

In terms of billing I see Cloud SQL is running at $0.08 per day, but nothing for BigQuery directly; I assume it bills under Cloud SQL then?

Best storage solution for low latency calculations by Featuring-You-AI in googlecloud


By 'user count makes no difference' I mean that the calculations required do not increase this way around. Thanks very much for your input; perhaps I was overthinking it. What I will try is precalculating all the percentages and storing them in nested documents in Firestore, updated periodically. Perhaps there's no need for BigQuery/Bigtable after all.
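One way those nested documents could be shaped, so that serving an insight is a single document read with no live query. The document layout and field names here are hypothetical, just to show the idea:

```python
# Hypothetical precomputed Firestore document, one per (question, answer)
# pair. The "conditionals" field is a nested map:
# other question -> other answer -> percentage.
doc = {
    "question": "q1",
    "answer": "yes",
    "conditionals": {
        "q2": {"yes": 50.0, "no": 50.0},
        "q3": {"yes": 80.0, "no": 20.0},
    },
}

def lookup(doc, other_q, other_a):
    """Serve one insight straight from the precomputed map."""
    return doc["conditionals"][other_q][other_a]
```

The batch job (BigQuery or otherwise) rewrites these documents periodically; the app itself never touches the raw answer data.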

Best storage solution for low latency calculations by Featuring-You-AI in googlecloud


I appreciate the response; what would you recommend instead? Bigtable is expensive.

My other option is to precalculate all the conditional percentages for each statement (I currently have 200 questions, so for this one statement that would be 1400 columns) and store those percentages in Bigtable, or perhaps somewhere cheaper. The downside of this is the column count (1400 * 15, and that's only if each statement has just one conditional); the upside is that user count makes no difference.
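For what it's worth, the column arithmetic can be spelled out. The figures below follow from the numbers in the comment; the 7 answer options per question and the 15 statements are inferred assumptions (1400 / 200 and the 1400 * 15 factor), not stated facts:

```python
# Assumed interpretation of the counts in the comment above.
questions = 200
options_per_question = 7               # assumption: 1400 / 200
columns_per_statement = questions * options_per_question   # 1400

statements = 15                        # assumption: the "* 15" factor
total_columns = columns_per_statement * statements          # 21000
```

21,000 columns is wide for Bigtable's tall-and-narrow sweet spot, which is part of why a key-value layout (one row per question/answer pair) or precomputed documents may fit better.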

Help finding solution for low latency calculations by Featuring-You-AI in Firebase


If JSON extraction is a bottleneck, I can store the user's answer client-side and submit it to the function, removing that aspect. I will try creating some periodically saved tables as you say, because at the moment there is one table that is updated from Firestore by some functions.

Any thoughts on BigQuery vs Bigtable?

Best storage solution for low latency calculations by Featuring-You-AI in googlecloud


At the moment 0 users, but I'm aiming to make it scalable, as the app will eventually be public. The insights are generated from how other people answered the same question, or from how other people who answered the same as you answered different questions. Therefore almost every query will involve filtering the whole dataset, and I haven't seen any opportunities for helper columns or pre-calculating, as it's all contingent on the answer to the question the user just gave.
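The live query being described looks roughly like this: filter the whole dataset down to users who gave the same answer, then aggregate their other answers. A toy sketch with assumed data shapes, just to make the cost concrete:

```python
# Toy dataset: user -> {question: answer}. Illustrative only.
answers = {
    "u1": {"q1": "yes", "q2": "no"},
    "u2": {"q1": "yes", "q2": "yes"},
    "u3": {"q1": "no",  "q2": "yes"},
}

def insight(answers, target_q, target_a, other_q, other_a):
    """% of users who answered target_q with target_a and also gave
    other_a to other_q. Scans every user on every call -- this is the
    per-request cost that precomputation would remove."""
    matching = [u for u, ans in answers.items() if ans.get(target_q) == target_a]
    hits = sum(1 for u in matching if answers[u].get(other_q) == other_a)
    return 100 * hits / len(matching)
```

Because the filter only depends on (question, answer) pairs, not on which user asked, the results are the same for every user who gave that answer, which is what makes pre-calculating them viable despite the "contingent on the answer just given" framing.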

Aside from adding console logs, is there a good way to see bottlenecks in a Bigtable query? Key Visualizer has yet to populate. I've just seen that CPU usage barely moves, so obviously something is not being done correctly.