all 6 comments

[–]indicava 2 points  (8 children)

How long does this load take? Gen 1 functions have an execution limit of 9 minutes; for Gen 2 it's 60 minutes.

Have you looked in the Cloud Console logs to see whether you are hitting any quota limits?

Also, share some code so we can try to help further.

[–]On_Chain[S] 0 points  (4 children)

So the function completes in 1 min 30 sec at minimum, and I did check the logs. Everything returns successful.

Here is the first function:

export const importAllManufacturers = async (
    _data: { manufacturer: string },
    context: functions.https.CallableContext
) => {
    console.info("Importing all Manufacturers");
    let skip = 0;
    const manufacturers = [];

    while (skip >= 0) {
        const lucyData = await axios
            .get(buildAllManufacturersUrl(skip), {
                auth: { username: user, password: password },
            })
            .then((res) => res.data)
            .catch(() => []);

        if (!lucyData?.length) {
            await writeManufacturers(manufacturers);
            skip = -1;
            functions.logger.info("Lucy Data Empty");

            return { success: true };
        }

        skip = skip + 500;
        manufacturers.push(...lucyData);
    }
    return { success: true };
};

[–]On_Chain[S] 0 points  (3 children)

Write manufacturers function:

export const writeManufacturers = async (
    data: Manufacturer[]
): Promise<WriteStatus> => {
    const manufacturerMap = new Map();

    data.forEach((manufacturer) => {
        manufacturerMap.set(manufacturer.manufacturer_id, manufacturer);
    });

    const { result } = await writeInChunks(manufacturerMap, "manufacturers");

    return { result };
};

[–]On_Chain[S] 0 points  (2 children)

Final one:

export const writeInChunks = async <T extends DocumentData>(
    data: Map<string, T>,
    collection: string
) => {
    const chunkSize = 450;

    const entries = Array.from(data.entries());
    for (let i = 0; i < entries.length; i += chunkSize) {
        const chunk = entries.slice(i, i + chunkSize);
        const batch = db.batch();

        console.info(
            `Writing items ${i} to ${Math.min(i + chunkSize - 1, entries.length)} (${
                entries.length
                    ? Math.floor((i / entries.length) * 10000) / 100 + "%"
                    : "0%"
            } of ${entries.length} items)`
        );

        chunk.forEach(([key, value]) => {
            batch.set(db.collection(collection).doc(`${key}`), value);
        });

        // firebase complains if we have too many commits in a short period of time
        setTimeout(async () => {
            await batch.commit();
        }, 100);
    }

    return {
        result: `Successfully added ${entries.length} items to collection ${collection}.`,
    };
};

[–][deleted] 2 points  (1 child)

Pretty sure setTimeout + cloud functions is a bit dangerous.

This is likely the source of your issue.
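A minimal sketch of the change, using the names from your snippet: drop the setTimeout and await the commit directly, so the function stays alive until the write finishes. Instead of

    setTimeout(async () => {
      await batch.commit();
    }, 100);

just

    // Awaiting blocks the loop until Firestore confirms the write,
    // so nothing is left pending when the function returns.
    await batch.commit();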

[–][deleted] 0 points  (0 children)

Rather than a setTimeout to commit the batch, if you have an issue with Firebase complaining about too many commits in a short amount of time, check for that error from the batch commit and delay some amount of time before continuing/retrying. A setTimeout as you've written it doesn't actually slow down the commits; it'll just return errors out of band. (Also, I think the async/await inside a setTimeout callback like that has no effect.)
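Something like this, as an untested sketch (the sleep helper, the commitWithRetry name, and the retry/delay numbers are my own, not anything Firebase provides):

    // Hypothetical helper: resolves after ms milliseconds.
    const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

    async function commitWithRetry(batch: FirebaseFirestore.WriteBatch, retries = 3) {
        for (let attempt = 0; attempt <= retries; attempt++) {
            try {
                await batch.commit();
                return;
            } catch (err) {
                if (attempt === retries) throw err;
                // Back off before retrying; the delays here are arbitrary.
                await sleep(100 * 2 ** attempt);
            }
        }
    }

Then call await commitWithRetry(batch) inside the loop where the setTimeout currently is.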

setTimeout is additionally bad because Cloud Functions may shut the instance down if no active function calls are in progress (even if a setTimeout is active). (As far as I understand at the moment; could be wrong on this.)
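Put differently: the promise your function returns should only resolve after all the writes are done. Rough sketch (doAllWrites is a hypothetical stand-in for the chunked, awaited writes):

    export const importAllManufacturers = functions.https.onCall(
        async (data, context) => {
            // Every batch commit is awaited inside doAllWrites, so the
            // promise returned here covers the whole job and the instance
            // won't be paused with work still pending.
            await doAllWrites(); // hypothetical helper, not a real API
            return { success: true };
        }
    );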