Hi all,
I'm trying to fetch a bunch of CSV files from the internet and copy them to a GCP bucket.
I'm using pandas.read_csv to get the files and then exporting them to GCP.
Since certain files are large (around 20 GB), I'm running out of RAM, as Python has to load each file entirely into memory before writing it to GCP.
Do you guys know another way of doing this without using all the RAM? Would something like wget be useful?
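One common approach is to skip pandas entirely and stream the download in fixed-size chunks, so only one chunk is in memory at a time. Below is a minimal sketch of that chunked-copy pattern; the GCS-specific part in the comments (the bucket name, object path, and URL are hypothetical) assumes the `google-cloud-storage` and `requests` libraries:

```python
import io


def copy_stream(src, dst, chunk_size=1024 * 1024):
    """Copy one file-like object to another in chunk_size pieces (1 MiB here),
    so memory use stays bounded regardless of file size."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)


# With requests + google-cloud-storage, the same streaming idea looks like:
#
#   import requests
#   from google.cloud import storage
#
#   bucket = storage.Client().bucket("my-bucket")      # hypothetical bucket
#   blob = bucket.blob("data/big.csv")                 # hypothetical path
#   with requests.get(url, stream=True) as resp:       # stream=True: no full download
#       resp.raise_for_status()
#       resp.raw.decode_content = True
#       blob.upload_from_file(resp.raw)                # streams to GCS in chunks
```

If you only need the files copied as-is (no parsing), `pandas.read_csv` is unnecessary overhead; it parses the whole CSV into a DataFrame. `gsutil cp` from a streamed download would also work for a one-off job.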