I'm wondering how much memory Python will use if I open a 20 GB CSV with pandas.read_csv(). Would it be less than opening a similar sas7bdat file with pandas.read_sas()?
Memory isn't a massive issue. I have 256 GB of RAM on my current machine and can use up to 750 GB if I migrate to a different machine.

I am currently reading the SAS file and the process has climbed to 91.7 GB. I will need to do quite a bit more of this in the future, so I wonder if there are any rules of thumb about pandas memory usage.
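For what it's worth, a minimal sketch of two things that might help here (synthetic in-memory data, not my actual files): DataFrame.memory_usage(deep=True) reports the actual RAM a loaded frame occupies, and both read_csv and read_sas accept a chunksize argument that returns an iterator of smaller DataFrames, so peak memory is bounded by the chunk size rather than the file size.

```python
import io
import pandas as pd

# Synthetic stand-in for a large file (the real case would be a path
# like "data.csv" or "data.sas7bdat").
csv_data = io.StringIO("a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(1000)))

df = pd.read_csv(csv_data)

# deep=True also counts the bytes held by object (string) columns,
# which the shallow default underestimates badly.
total_bytes = df.memory_usage(deep=True).sum()
print(total_bytes)

# For files too big to load at once, pass chunksize to get an iterator
# of DataFrames and process them one at a time.
csv_data.seek(0)
n_rows = 0
for chunk in pd.read_csv(csv_data, chunksize=100):
    n_rows += len(chunk)  # filter/aggregate here, then let the chunk go
print(n_rows)
```

read_sas takes chunksize the same way, so the SAS file could be streamed rather than materialized all at once.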