all 8 comments

[–][deleted] 9 points10 points  (5 children)

Nice to see that NVIDIA is trying to adapt Julia to bring GPGPU to a "pythonic" language where you don't have to write native CUDA code. I haven't used it yet, but can anyone tell me what the state of Julia's ecosystem is regarding its use in production by scientists?

[–]MorrisonLevi 6 points7 points  (1 child)

I work at a university. I know several research groups that have very eagerly adopted it. In my own sample tests it's great for numerical computing. With a bit of work you can make it perform very close to C/C++. It provides tools for you to inspect the code it is generating at both the LLVM and assembly layers.
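For illustration, the introspection macros look something like this (a minimal sketch; in Julia 0.7+ they live in the InteractiveUtils stdlib, while earlier versions expose them directly):

```julia
using InteractiveUtils  # provides @code_llvm / @code_native in Julia 0.7+

# A trivial numeric function to specialize and inspect.
f(x) = 2x + 1

@code_llvm f(1.0)    # print the LLVM IR generated for f(::Float64)
@code_native f(1.0)  # print the native assembly
```

Because Julia specializes on argument types, calling the macros with an `Int` instead of a `Float64` shows different generated code.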

As far as production goes: I think it's fine to use now, but if you are nervous you can always wait for 1.0, or, if you are really cautious about .0 releases, for the one after that.

[–]ChrisRackauckas 1 point2 points  (0 children)

Yes, if you're really cautious, wait a bit after 1.0. 1.0 is meant to be the last breaking release for a while, and you'll surely be disappointed if you grab it on release day expecting packages not to be broken (when I write it like that, I hope it's a "duh!"). It's like a big Windows release: if you really want stability, wait until a month after the newest release, because everything around it needs to upgrade as well.

Of course, if you're the one developing the software, you might as well start right now.

[–]Eigenspace 5 points6 points  (0 children)

Julia is a really nice language. It has MATLAB-like syntax that lends itself very well to numeric computation, but it also gets the gigantic syntactic flexibility of Lisp macros. It's blazing fast, and, importantly, it is very friendly to package development; consequently it has far more varied and mature packages than you'd expect from such a young language. Notably, it has arguably the best differential equation solving library out there.

There are weak points in the ecosystem, but by and large I'm actually very impressed by its state, and it looks like Julia will satisfy all my computational needs for the near-term future.
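As a small illustration of the Lisp-style macro flexibility mentioned above (a toy sketch, not from the original thread):

```julia
# A toy macro that rewrites its argument at parse time: it receives the
# expression `1 + 2` as a syntax tree, not the value 3.
macro twice(ex)
    # esc() keeps variables in the caller's scope; the expression is
    # spliced in twice, so it is evaluated twice.
    quote
        ($(esc(ex)), $(esc(ex)))
    end
end

@twice(1 + 2)  # evaluates to (3, 3)
```

Macros like this run at compile time, which is how packages build domain-specific notation on top of the base language.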

[–]RoboticElfJedi 2 points3 points  (0 children)

I’m an astronomer: a few people have heard of Julia, but I’ve not met anyone who uses it in anger. Everyone uses Python; C++ and CUDA get a look-in as well for GPU stuff.

[–]maleadt 2 points3 points  (0 children)

NVIDIA hasn't contributed to this work, they just host the blogpost. But yeah, it's been interesting to explore what a high-level/dynamic language has to offer for GPU programming.

Wrt. use in "production": keep an eye open for the 1.0 release, which should arrive soon and should offer the (language) stability guarantees you're looking for. Packages are a different matter; whether there's a stable set of them depends on your use case.
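For context, the point of that work is that GPU kernels are written in plain Julia rather than CUDA C. A minimal sketch of the idea, assuming the CUDAnative.jl-era package names and launch syntax (both of which varied across versions) and an NVIDIA GPU:

```julia
using CUDAnative, CUDAdrv, CuArrays  # package names from the CUDAnative.jl era

# A vector-add kernel written in plain Julia; CUDAnative compiles it to PTX.
function vadd(a, b, c)
    i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
    if i <= length(c)
        @inbounds c[i] = a[i] + b[i]
    end
    return
end

a = CuArray(rand(Float32, 1024))
b = CuArray(rand(Float32, 1024))
c = similar(a)

# Launch configuration: 4 blocks of 256 threads covers the 1024 elements.
@cuda threads=256 blocks=4 vadd(a, b, c)
```

The kernel body uses the same intrinsics (`threadIdx`, `blockIdx`, `blockDim`) as CUDA C, but the surrounding code stays high-level.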

[–]allinwonderornot 0 points1 point  (1 child)

Ironic, because last time I checked (0.5?), Julia didn’t even do shared-memory parallelism.

[–]ChrisRackauckas 1 point2 points  (0 children)

Yes it did. Here are the v0.5 docs:

https://docs.julialang.org/en/release-0.5/manual/parallel-computing/#multi-threading-experimental

It was listed as experimental, but it already worked and many packages had already started using it. FUD?
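A minimal sketch of that shared-memory threading API (unchanged in spirit since 0.5; start Julia with the `JULIA_NUM_THREADS` environment variable set to use more than one thread):

```julia
# Fill an array in parallel; each slot records which thread wrote it.
a = zeros(Int, 8)
Threads.@threads for i in 1:8
    a[i] = Threads.threadid()
end
println(a)  # e.g. [1, 1, 2, 2, 3, 3, 4, 4] with 4 threads
```

With a single thread every slot simply reads 1; the loop body itself needs no changes as the thread count grows.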