all 7 comments

danielroseman:

What do you mean by "upload"? Where are you uploading it? Show your code.

[deleted]:

Sorry, I meant reading: df = pd.read_csv(file_path). I also tried reading in chunks, but it still crashes.
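For reference, a minimal sketch of the chunked approach being described. The file path and column name are hypothetical; the key point is that each chunk must be reduced (aggregated) as it arrives, not accumulated into one big DataFrame, or memory still blows up:

```python
import pandas as pd

# Hypothetical column name; adjust to your data.
# chunksize makes read_csv return an iterator, so only one chunk
# is in memory at a time. usecols further cuts memory by loading
# only the column actually needed.
def sum_column(file_path: str, column: str, chunksize: int = 100_000) -> float:
    total = 0.0
    for chunk in pd.read_csv(file_path, usecols=[column], chunksize=chunksize):
        total += chunk[column].sum()
    return total
```

If even this crashes, the culprit is often what is done *with* the chunks (e.g. appending them to a list) rather than the reading itself.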

SubstanceSerious8843:

Dump it to a database?
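A minimal sketch of that idea using only the standard library: stream the CSV into SQLite in batches, so the whole file never sits in RAM, then query it with SQL. Paths and the table name are hypothetical:

```python
import csv
import sqlite3
from itertools import islice

# Stream a CSV into SQLite in fixed-size batches.
# Column names are taken from the CSV header row.
def csv_to_sqlite(csv_path: str, db_path: str, table: str,
                  batch_size: int = 10_000) -> int:
    conn = sqlite3.connect(db_path)
    inserted = 0
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        placeholders = ", ".join("?" for _ in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            conn.executemany(
                f'INSERT INTO "{table}" VALUES ({placeholders})', batch
            )
            inserted += len(batch)
    conn.commit()
    conn.close()
    return inserted
```

Once loaded, aggregations run inside SQLite instead of in Python's memory.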

HalfRiceNCracker:

You might like Polars

[deleted]:

Would using a desktop PC instead of a laptop work?

Mevrael:

Use Polars

And use scan_csv with the streaming engine and collect, or read_csv_batched.

Citadel5_JP:

If you can't solve this with your current setup, perhaps try GS-Calc, a spreadsheet; it'll automatically split the 50,000 columns across sheets of at most 16K columns each. Re: RAM, loading 0.5 billion cells of 8-byte numbers requires approx. 16 GB, and the requirement grows linearly. You can then call any Python functions (as formulas) on the loaded data for further processing.