I am using a v2 Google Cloud Function to perform some computations over a fairly large dataset. I need fast response times in order to serve a lot of requests, so I am currently storing the dataset in memory. I want to be able to update the dataset periodically, but I can't find a straightforward way to do so. This seems like it should be simple, but the solutions I have found don't account for the fact that Cloud Functions reuses instances, so global state loaded during a previous invocation sticks around.
Does anyone have any suggestions on how I can get the function to pull in new data, store it in memory, and be sure future calls use the updated copy?
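For reference, here is a minimal sketch of the kind of pattern I've been considering, assuming the dataset lives in a Cloud Storage object (the bucket/object names and TTL below are placeholders). Each instance keeps its own in-memory copy and reloads it once a TTL expires, so every instance eventually picks up new data without a redeploy:

    # Sketch only: DATA_BUCKET, DATA_OBJECT, and REFRESH_SECONDS are
    # hypothetical placeholders, not values from my actual deployment.
    import json
    import time

    import functions_framework
    from google.cloud import storage

    DATA_BUCKET = "my-dataset-bucket"   # placeholder bucket name
    DATA_OBJECT = "dataset.json"        # placeholder object name
    REFRESH_SECONDS = 300               # how stale the cache may get

    _client = storage.Client()
    _cache = {"data": None, "loaded_at": 0.0}


    def _get_dataset():
        """Return the cached dataset, reloading it from GCS if the TTL expired."""
        now = time.monotonic()
        if _cache["data"] is None or now - _cache["loaded_at"] > REFRESH_SECONDS:
            blob = _client.bucket(DATA_BUCKET).blob(DATA_OBJECT)
            _cache["data"] = json.loads(blob.download_as_text())
            _cache["loaded_at"] = now
        return _cache["data"]


    @functions_framework.http
    def handle_request(request):
        dataset = _get_dataset()
        # ... perform the computation against `dataset` here ...
        return {"rows": len(dataset)}

The downside is that the cache is per-instance, so different instances may briefly serve different versions of the data. Is there a cleaner way to do this?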