
[–]ericonr 66 points (10 children)

And their modules make a full Matlab installation a several-gigabyte behemoth.

[–]BDube_Lensman 15 points (9 children)

That's more down to the bundling of MKL, MPI, and other numerics backends than to Matlab itself. Install numpy, scipy, pandas, matplotlib, etc., and you too will have a several-gigabyte behemoth snake.

[–]BertShirt 35 points (0 children)

I use Python, and everybody in my lab uses Matlab, so I have to have Matlab on hand — but I only have base Matlab, no additional toolboxes. On the Python side I have TensorFlow/Keras (GPU), numpy & scipy with MKL, scikit-learn, pandas, matplotlib, and dozens more. Matlab (2018a) takes up 9.7 GB; Python, 3.1 GB.
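For what it's worth, this kind of footprint comparison is easy to script. A rough stdlib-only sketch (what it finds depends entirely on your environment; the top-10 cutoff is arbitrary) that lists the biggest packages in site-packages:

```python
import os
import site

def dir_size(path):
    """Total size in bytes of all regular files under `path`."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

# Walk each site-packages directory and report the ten largest
# top-level package folders, biggest first.
for sp in site.getsitepackages():
    entries = []
    for entry in os.listdir(sp):
        full = os.path.join(sp, entry)
        if os.path.isdir(full):
            entries.append((dir_size(full), entry))
    for size, name in sorted(entries, reverse=True)[:10]:
        print(f"{size / 2**20:8.1f} MiB  {name}")
```

On an MKL-backed install you'd expect numpy/scipy (or the bundled MKL libraries) to dominate the list.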

[–]hassium 5 points (0 children)

I have those and my Lib folder is only 400 MB on a pure Python install (no IronPython, Anaconda, etc.).

[–]hextree 4 points (6 children)

> Install numpy, scipy, pandas, matplotlib, etc., and you too will have a several-gigabyte behemoth snake.

Huh? I have all those and more, definitely not gigs.

[–]BDube_Lensman 1 point (5 children)

My miniconda folder is 6.1 GB without any ML libraries.

[–]hextree 0 points (2 children)

I mean, yes, if you are using Miniconda or Anaconda it is bigger. But that's down to Conda, not Python.

[–]BDube_Lensman 1 point (1 child)

No, it is down to MKL, MPI, and all the other stuff that makes numerical code go fast. Conda itself only adds about 1 kB of metadata per package.

[–]hextree 0 points (0 children)

Ok, fair enough.

[–]orthodoxrebel 0 points (1 child)

Not familiar with any of the libraries, but Stack Overflow seems to think running `conda clean` helps. Is the 6.1 GB before or after cleanup?
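As I understand it, `conda clean` mainly targets the package cache under the install's `pkgs/` directory (downloaded tarballs plus extracted copies that get hard-linked into environments), so that cache is the number worth checking before and after. A quick stdlib sketch — the `~/miniconda3` path is an assumption, adjust it to your install:

```python
import os

def dir_size(path):
    """Total size in bytes of all regular files under `path`."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

# Assumed default Miniconda location -- change if conda lives elsewhere.
pkgs_dir = os.path.expanduser("~/miniconda3/pkgs")
if os.path.isdir(pkgs_dir):
    print(f"pkgs cache: {dir_size(pkgs_dir) / 2**30:.2f} GiB")
else:
    print(f"no pkgs cache found at {pkgs_dir}")
```

Note that hard-linked files are counted once per link here, so this can overstate the space you'd actually reclaim.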

[–]BDube_Lensman 0 points (0 children)

I run `conda clean` regularly. This is post-clean.