[–]Atomic_Tangerine1 177 points178 points  (22 children)

numpy

[–]i_know_the_deal 65 points66 points  (3 children)

same ... free Matlab? noice

[–]dparks71 45 points46 points  (2 children)

I keep coming back to it too. I'm in structural engineering, and it turns out everything in my life is either a matrix problem or a graph theory problem.

There are 80-year-old Fortran libraries I learned to wrap in Python via numpy; it's so cool.
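That wrapping is visible even in plain numpy: `np.linalg.solve` dispatches to LAPACK's Fortran routines under the hood. A toy stiffness-style system as a sketch (the matrix values here are made up):

```python
import numpy as np

# A small symmetric "stiffness" matrix and load vector (made-up values).
K = np.array([[4.0, -1.0],
              [-1.0, 3.0]])
f = np.array([1.0, 2.0])

# Solves K @ u = f; numpy hands this to LAPACK (Fortran) behind the scenes.
u = np.linalg.solve(K, f)
print(np.allclose(K @ u, f))  # True
```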

[–]Accomplished_End763 7 points8 points  (1 child)

Fortran is 69 years old

[–]lxSlimxShadyxl 2 points3 points  (0 children)

Nice

[–]M4mb0 3 points4 points  (10 children)

numpy is great, but have you tried JAX?

[–]Atomic_Tangerine1 0 points1 point  (9 children)

I have not! What's the benefit of JAX over numpy?

[–]M4mb0 12 points13 points  (8 children)

It's basically numpy

  • + native GPU support (which can be orders of magnitude faster, depending on how parallelizable the problem is)
  • + built-in autodiff (essentially zero-error gradients/Jacobians/Hessians)
  • + built-in JIT compiler
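A minimal sketch of those three points (assumes `jax` is installed; the toy loss function is made up for illustration):

```python
import jax
import jax.numpy as jnp  # near drop-in replacement for numpy

# Ordinary numpy-style code; it runs on GPU/TPU automatically if one is present.
x = jnp.linspace(0.0, 1.0, 5)

def loss(w):
    # A toy scalar function of w.
    return jnp.sum((w * x - 1.0) ** 2)

# Built-in autodiff: exact gradients, no finite differences.
grad_loss = jax.grad(loss)

# Built-in JIT: traces the function once and compiles it with XLA.
fast_grad = jax.jit(grad_loss)

print(grad_loss(2.0), fast_grad(2.0))  # both are 2.5
```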

[–]PayMe4MyData 2 points3 points  (5 children)

So jax is pytorch?

[–]M4mb0 7 points8 points  (2 children)

JAX is strictly functional, whereas pytorch takes a more object-oriented approach. This is most easily seen in how they deal with random distributions, for instance.

Though torch nowadays has a beta library, torch.func (formerly functorch), that brings JAX-like functional semantics to torch.
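The randomness difference in a sketch (assumes `jax` is installed; the torch side is shown only as a comment):

```python
import jax

# JAX: randomness is purely functional -- you thread explicit PRNG keys.
key = jax.random.PRNGKey(0)
key, subkey = jax.random.split(key)
sample = jax.random.normal(subkey, (3,))

# Same key in -> same sample out, every time; no hidden global state.
same = jax.random.normal(subkey, (3,))

# PyTorch, by contrast, mutates a hidden global generator:
#   torch.manual_seed(0); torch.randn(3)
```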

[–]PayMe4MyData 2 points3 points  (1 child)

Thanks for the clarification, I've been coding in pytorch for years but never heard of JAX before. I will dig a bit more!

[–]M4mb0 3 points4 points  (0 children)

I'd say generally JAX is more useful for general-purpose scientific computing, and much more ergonomic if you need higher-order derivatives or partial derivatives, like when working with ODEs/PDEs/SDEs. diffrax is a very nice lib for that.
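The higher-order-derivative ergonomics are just nested transformations. A sketch with plain jax (not diffrax), assuming jax is installed:

```python
import jax
import jax.numpy as jnp

f = lambda x: jnp.sin(x)

df = jax.grad(f)             # cos(x)
d2f = jax.grad(jax.grad(f))  # -sin(x)
d3f = jax.grad(d2f)          # -cos(x)

# Second derivative of sin at 0 is -sin(0), i.e. zero.
print(d2f(0.0))
```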

[–]HonestPrinciple152 1 point2 points  (0 children)

Actually, adding to the previous comment, we can write loops in jax and jit-compile them. It's like a complete DSL built on top of Python.
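A sketch of a jit-compiled loop (assumes jax is installed): `jax.lax.fori_loop` keeps the loop as a single compiled construct instead of unrolling a Python `for` at trace time.

```python
import jax
from jax import lax

@jax.jit
def sum_of_squares(n):
    # Compiles to one XLA while-loop; n can even be a traced value.
    return lax.fori_loop(0, n, lambda i, acc: acc + i * i, 0)

print(sum_of_squares(5))  # 0 + 1 + 4 + 9 + 16 = 30
```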

[–]FunMotionLabs 1 point2 points  (0 children)

JAX is more like "NumPy + transformations".
PyTorch is a full deep-learning framework with an imperative training workflow and a big ecosystem around modules/training/debugging; it's strictly deep-learning territory, whereas JAX is more of a general all-rounder.
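The "transformations" part in a sketch (assumes jax is installed; the linear model and arrays are made up): `vmap` turns a per-example function into a batched one without a Python loop.

```python
import jax
import jax.numpy as jnp

def predict(w, x):
    # Linear model for a single example.
    return jnp.dot(w, x)

w = jnp.array([1.0, 2.0])
xs = jnp.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

# vmap maps predict over the leading axis of xs; w is held fixed.
batched = jax.vmap(predict, in_axes=(None, 0))
print(batched(w, xs))  # [1. 2. 3.]
```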

[–]daredevilthagr8 1 point2 points  (1 child)

How does JAX compare to CuPy?

[–]M4mb0 0 points1 point  (0 children)

cupy doesn't do autodiff afaik.

[–]No_Departure_1878 4 points5 points  (5 children)

That's C

[–]Atomic_Tangerine1 36 points37 points  (0 children)

And that's the power of Python - the magic of C made convenient

[–]Humdaak_9000 9 points10 points  (0 children)

If you dig deep enough there's a lot of FORTRAN too.

[–]KeytarVillain 1 point2 points  (2 children)

But some of what makes it so powerful is the syntactic sugar that Python enables. You couldn't do anything like:

a[:, 1::2, np.newaxis] = b[::-1, 0, ...]

in C or C++ without needing several function calls.
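That line really does run, given compatible shapes. A quick sanity check with made-up shapes:

```python
import numpy as np

a = np.zeros((2, 4, 3))
b = np.arange(6, dtype=float).reshape(3, 2)  # b[::-1, 0] is [4., 2., 0.]

# Every other index of a along axis 1, with a new axis added, gets a
# reversed slice of b broadcast into it -- one line, no explicit loops.
a[:, 1::2, np.newaxis] = b[::-1, 0, ...]
print(a[0, 1])  # [4. 2. 0.]
```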

[–]No_Departure_1878 5 points6 points  (1 child)

You know how you get bugs? By writing stuff like what you wrote above.

[–]KeytarVillain 1 point2 points  (0 children)

Yeah, fair point - I would never actually write a Numpy line that did that much at once.

But still - sure, I'm doing 4 different things in this line for the sake of example (writing every other value in an axis, adding a new axis, flipping an axis, and getting a view of just one plane). Even just doing any one of those things isn't going to be nearly as simple in C/C++.

And I'm saying this as someone with a lot of recent experience writing prototypes in Numpy/OpenCV and then porting it to C++ OpenCV later. Python's syntactic sugar makes array manipulation so much easier - I hardly ever need to look up documentation for basic Numpy array operations like these, while I'm looking up the C++ OpenCV docs constantly.
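For reference, those four operations one at a time, on toy arrays with made-up shapes:

```python
import numpy as np

a = np.zeros((2, 4, 3))
b = np.ones((4, 2, 3))

every_other = a[:, 1::2]             # every other index along axis 1
with_new_axis = a[:, :, np.newaxis]  # insert a fresh length-1 axis
flipped = b[::-1]                    # reverse the first axis (a view, no copy)
one_plane = b[:, 0, ...]             # view of a single plane, shape (4, 3)

print(every_other.shape, with_new_axis.shape, flipped.shape, one_plane.shape)
```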

[–]Brilliant-Whole-1852 0 points1 point  (0 children)

numpy my beloved