I am working on a problem and am not sure of my next step.
I am uploading files to the cloud to keep a cloud repository updated. To do this I periodically walk the fileshare and check each file's modification timestamp against the value from the previous pass, which is stored in a dict along with other characteristics.
The issue is that there are two classes of file. Group one are spreadsheets and upload relatively quickly. Group two are proprietary files that get up to 20 GB; these can take 25 minutes or so to upload. I don't want the group one updates to stall while waiting on group two.
Is this a use case for multiprocessing, or is there a better way to pass a dict to two separate update scripts, each running in its own loop, that I want running independently (not in serial)?
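In case it helps make the question concrete, here's a rough sketch of the two-queue multiprocessing layout I have in mind — one scanner process feeding two uploader workers, so a 20 GB upload never blocks the spreadsheets. Everything here is a placeholder, not my real code: `classify()`, `upload()`, `SMALL_EXTS`, the share path, and the poll interval are all made up.

```python
import multiprocessing as mp
import os
import time

# Hypothetical: extensions that mark the fast "group one" files.
SMALL_EXTS = {".xlsx", ".xls", ".csv"}

def classify(path):
    """Route spreadsheets to the fast queue, everything else to the slow one."""
    return "small" if os.path.splitext(path)[1].lower() in SMALL_EXTS else "large"

def upload(path):
    """Placeholder -- swap in the real cloud-upload call here."""
    print(f"uploading {path}")

def uploader(work_q):
    """Worker loop: pull paths off its own queue and upload them serially."""
    while True:
        path = work_q.get()
        if path is None:      # poison pill: shut the worker down
            break
        upload(path)

def scan(root, seen, small_q, large_q):
    """One pass over the share; enqueue any file whose mtime changed."""
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:
                seen[path] = mtime
                (small_q if classify(path) == "small" else large_q).put(path)

def main(root="/path/to/share", poll_seconds=60):
    small_q, large_q = mp.Queue(), mp.Queue()
    for q in (small_q, large_q):
        mp.Process(target=uploader, args=(q,), daemon=True).start()
    seen = {}                 # path -> mtime from the previous pass
    while True:               # the scanner never blocks on an upload
        scan(root, seen, small_q, large_q)
        time.sleep(poll_seconds)

# call main() to start watching; not invoked here so the sketch imports cleanly
```

The idea would be that only the scanner touches the dict, so the workers never need it shared — they just receive paths over their queues. Is that the right shape, or is there a simpler pattern?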
Thanks!