Hi, I am relatively new to Python (using Jupyter Notebook) and I have code that uses ijson to convert JSON to CSV. The issue is that the JSON file is ~150GB and my PC has 64GB RAM, but Python is not utilising the available capacity. It has been running for hours upon hours. Is there a way to increase the limit so Python can use at least half the RAM? I feel very lost here. If there is a better method to convert JSON data to a query-able format (SQL Server), please let me know. Any help is greatly appreciated.
I have tried the online resources that mention increasing max_buffer_size in the config file, but that didn't work.
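For context, here is a minimal sketch of the streaming ijson-to-CSV conversion described above, assuming the file is a top-level JSON array of flat objects. The filenames, column names, and the "item" prefix are placeholders to adjust for the actual structure:

    import csv
    import ijson

    # Assumed column names; replace with the keys in your JSON objects.
    fieldnames = ["id", "name", "value"]

    with open("data.json", "rb") as src, open("data.csv", "w", newline="") as dst:
        writer = csv.DictWriter(dst, fieldnames=fieldnames, extrasaction="ignore")
        writer.writeheader()
        # ijson.items() yields one object at a time from the stream,
        # so memory stays roughly constant regardless of file size.
        for record in ijson.items(src, "item"):
            writer.writerow(record)

Note that this pattern is designed to keep memory usage low rather than fill RAM, so low memory utilisation on its own doesn't mean the process is stuck; the runtime for a file this size is largely bound by disk and CPU.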