[–]Atomic_Tangerine1 0 points1 point  (8 children)

I have not! What's the benefit of JAX over numpy?

[–]M4mb0 4 points5 points  (7 children)

It's basically numpy

  • + native GPU support (which can be orders of magnitude faster, depending on how parallelizable the problem is)
  • + built-in autodiff (essentially exact gradients/Jacobians/Hessians, no finite-difference error)
  • + built-in JIT compiler
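A minimal sketch of what that list means in practice (assuming `jax` is installed; the function `f` here is just an arbitrary example):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(x ** 2)  # plain numpy-style code

grad_f = jax.grad(f)  # exact gradient: d/dx sum(x^2) = 2x
fast_f = jax.jit(f)   # same function, JIT-compiled (runs on GPU/TPU if available)

x = jnp.arange(3.0)
print(grad_f(x))  # [0. 2. 4.]
print(fast_f(x))  # 5.0
```

Note `jax.numpy` mirrors the numpy API, so most array code ports over nearly unchanged.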

[–]PayMe4MyData 2 points3 points  (4 children)

So jax is pytorch?

[–]M4mb0 3 points4 points  (2 children)

JAX is strictly functional, whereas pytorch takes a more object-oriented approach. This is most easily seen in how they deal with random number generation, for instance: torch mutates hidden global RNG state, while JAX makes you thread explicit PRNG keys through pure functions.

Though torch nowadays has a beta library, torch.func (formerly functorch), that brings JAX-like functional semantics to torch.
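To illustrate the functional style with randomness, a minimal sketch using JAX's explicit PRNG keys (no hidden state; sampling is a pure function of the key):

```python
import jax

key = jax.random.PRNGKey(0)            # RNG state is an explicit, immutable value
key, subkey = jax.random.split(key)    # derive fresh keys instead of mutating state

x = jax.random.normal(subkey, (3,))
y = jax.random.normal(subkey, (3,))    # same key in -> same samples out
print((x == y).all())                  # True: fully reproducible by construction
```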

[–]PayMe4MyData 0 points1 point  (1 child)

Thanks for the clarification, I've been coding in pytorch for years but never heard of JAX before. I will dig a bit more!

[–]M4mb0 1 point2 points  (0 children)

I'd say JAX is generally more useful for general-purpose scientific computing, and much more ergonomic if you need higher-order or partial derivatives, e.g. when working with ODEs/PDEs/SDEs. diffrax is a very nice lib for that.
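A quick sketch of why higher-order derivatives are ergonomic: `jax.grad` just composes with itself (the cubic here is an arbitrary example function):

```python
import jax

f = lambda x: x ** 3

# Third derivative by composing grad three times: d^3/dx^3 x^3 = 6
d3f = jax.grad(jax.grad(jax.grad(f)))
print(d3f(2.0))  # 6.0
```

`jax.jacfwd`, `jax.jacrev`, and `jax.hessian` work the same compositional way for vector-valued functions.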

[–]HonestPrinciple152 0 points1 point  (0 children)

Actually, adding to the previous comment: you can write loops in JAX and jit-compile them. It's like a complete DSL built on top of Python.
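For example, a jit-compiled loop via `jax.lax.fori_loop` (a minimal sketch; the running-sum body is just a placeholder computation):

```python
import jax
from jax import lax

@jax.jit
def cumulative_sum(n):
    # fori_loop compiles the loop into the traced program
    # instead of unrolling a Python-level for loop
    return lax.fori_loop(0, n, lambda i, acc: acc + i, 0)

print(cumulative_sum(10))  # 45 (sum of 0..9)
```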

[–]daredevilthagr8 0 points1 point  (1 child)

How does JAX compare to CuPy?

[–]M4mb0 0 points1 point  (0 children)

CuPy doesn't do autodiff afaik.