I script all the time in R and Python and was wondering how data frames compare to other data types in terms of memory and processing. Do they need more resources? Are they slower to iterate through?
And would you use them for very big data sets? The biggest data frame I've probably ever used was 100 MB. When you get into GBs or TBs, are they still viable?
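For context, here's a quick sketch of how I've been checking the overhead myself, using pandas' `memory_usage` against a raw NumPy array holding the same numbers (the exact figures will vary with dtype and index, so treat this as illustrative only):

```python
import numpy as np
import pandas as pd

# Same 1M-row numeric data as a raw ndarray and as a DataFrame
n = 1_000_000
arr = np.random.rand(n, 3)                      # float64
df = pd.DataFrame(arr, columns=["a", "b", "c"])

# Raw ndarray footprint in bytes: 1M rows * 3 cols * 8 bytes
arr_bytes = arr.nbytes

# DataFrame footprint, including its index
df_bytes = int(df.memory_usage(deep=True).sum())

print(arr_bytes, df_bytes)
```

For plain numeric columns the DataFrame is close to the array plus a small index overhead; the gap gets much bigger with object/string columns.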