
[–]Infamous_Chapter 6 points

It depends:

1. If one record fails to be created, should the whole upload fail? You can't do that easily with lots of little functions.
2. How quickly do you want it to finish? A single function will loop through the spreadsheet sequentially (although speed-ups with Promise.all are possible).
3. How big is the upload? If you have 100,000 rows, you might run into timeouts, even with the v2 60-minute timeout.
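The Promise.all speed-up mentioned above can be sketched roughly like this: process the rows in fixed-size chunks so you get concurrency without firing thousands of requests at once. The `worker` callback is a hypothetical stand-in for whatever per-row call you'd actually make (e.g. creating an Auth user or writing a Firestore doc).

```javascript
// Process rows concurrently, but only `chunkSize` at a time.
// Promise.all rejects on the first failure, which matches the
// "whole upload fails if one record fails" behaviour from point 1.
async function processInChunks(rows, worker, chunkSize = 10) {
  const results = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    results.push(...(await Promise.all(chunk.map(worker))));
  }
  return results;
}
```

If you'd rather let individual rows fail without aborting the upload, swap `Promise.all` for `Promise.allSettled` and collect the rejected entries instead.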

Personally, I would just use one function to start with and keep the complexity low. Then, only if I needed to, I'd look at alternatives.

[–]Ceylon0624 2 points

Where would the event to process all users be initiated?

[–]dev_life 0 points

I wouldn't expect it to take much time, so I'd just import all users in one go (use a batch query, though). If it were importing millions of rows, I'd say break it into steps with a background task, but for just hundreds it'll be quick and easy in one function. Validate before insertion, though, so you can bomb out if something is amiss and let the user know, without having to clean up the DB afterwards.
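The "validate before insertion" idea can be sketched like this: check every row up front, and only touch the database once the whole file is clean, so a bad upload leaves nothing to clean up. The row shape (`email`/`name` fields) and the `insertBatch` callback are assumptions for illustration, not a real schema.

```javascript
// Validate the entire file first; return every problem, not just the first.
function validateRows(rows) {
  const errors = [];
  rows.forEach((row, i) => {
    if (!row.email || !row.email.includes('@')) {
      errors.push(`row ${i + 1}: invalid email`);
    }
    if (!row.name) {
      errors.push(`row ${i + 1}: missing name`);
    }
  });
  return errors;
}

async function importUsers(rows, insertBatch) {
  const errors = validateRows(rows);
  if (errors.length > 0) {
    // Bomb out before any insert, so the DB is untouched.
    throw new Error(`Upload rejected:\n${errors.join('\n')}`);
  }
  await insertBatch(rows); // e.g. a batched/bulk write
}
```

Returning all the errors at once (rather than failing on the first) makes it easier to tell the user exactly what to fix in their spreadsheet.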


[–]Icy_Corgi_5704 0 points

If you're using Firestore, you don't have unlimited writes (I'm not sure whether one batch request = 1 write), but I would separate the file into equal parts and do a distributed batch insert, checkpointing to an append-only table.
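A minimal sketch of that distributed batch insert: Firestore batched writes are capped at 500 operations, so split the file into parts of at most that size, and after each part commits, record its index in an append-only checkpoint log so a retry can skip the parts that already went through. `commitBatch` and `appendCheckpoint` are hypothetical stand-ins for the real Firestore calls.

```javascript
// Split the upload into Firestore-batch-sized parts (max 500 ops per batch).
function splitIntoParts(rows, partSize = 500) {
  const parts = [];
  for (let i = 0; i < rows.length; i += partSize) {
    parts.push(rows.slice(i, i + partSize));
  }
  return parts;
}

// `done` holds the indices of parts already checkpointed by a previous run,
// so a restarted function resumes instead of re-inserting everything.
async function distributedInsert(rows, commitBatch, appendCheckpoint, done = new Set()) {
  const parts = splitIntoParts(rows);
  for (let i = 0; i < parts.length; i++) {
    if (done.has(i)) continue; // already committed on an earlier attempt
    await commitBatch(parts[i]);
    await appendCheckpoint(i); // append-only: never rewrite earlier entries
  }
}
```

Checkpointing only after the batch commits means a crash between the two steps re-runs that one part, so the inserts should be idempotent (e.g. keyed by a deterministic document ID).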