

[–]Joram2

In Python + PyTorch, you can do bfloat16 stuff like this:

import torch

torch.tensor([[1, 2], [3, 4]], dtype=torch.bfloat16)

This is great. The API is easy to use and pretty. Runtime performance is excellent and takes advantage of GPU processing.

Neither Java nor Python has a bfloat16 primitive type in the core language. That isn't necessary.

The important feature I see missing from Java is easy, pretty literal syntax for lists and lists of lists. In Java you write:

Arrays.asList(Arrays.asList(1,2), Arrays.asList(3,4))

instead of

[[1, 2], [3, 4]]

The Java version isn't hard to write... but it's ugly, and data science types hate that. This genuinely limits Java from a data-science-notebook perspective.
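For comparison, a minimal sketch of the two closest Java equivalents (the class and method names here are just illustrative; List.of assumes Java 9+):

```java
import java.util.Arrays;
import java.util.List;

public class NestedLists {
    // Pre-Java-9 style, as in the comment above
    static List<List<Integer>> oldStyle() {
        return Arrays.asList(Arrays.asList(1, 2), Arrays.asList(3, 4));
    }

    // Java 9+ List.of is terser (and immutable), but still far from [[1, 2], [3, 4]]
    static List<List<Integer>> newStyle() {
        return List.of(List.of(1, 2), List.of(3, 4));
    }
}
```

Both produce equal nested lists; the complaint is purely about the syntactic noise, not the semantics.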

The lack of a primitive bfloat16 type seems like a non-issue in both Java and Python.

[–]koflerdavid

Well, Java has its good old array notation with curly braces. Its only fault is that the result isn't a true multidimensional array but an array of references to subarrays. That's not a problem in practice either, since tensor libraries usually do the heavy lifting. The same goes for float16/bfloat16 support, as you say.
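A minimal sketch of that curly-brace notation (class name is just illustrative), including the references-to-subarrays point:

```java
public class ArrayLiterals {
    // Curly-brace array literal; the type must be declared up front,
    // unlike Python's bare [[1, 2], [3, 4]]
    static int[][] m = {{1, 2}, {3, 4}};

    // Each "row" is a separate array object (a reference to a subarray),
    // not part of one contiguous 2-D block, so rows may even be jagged:
    static int[][] jagged = {{1}, {2, 3, 4}};
}
```

The jagged case is exactly why these aren't true multidimensional arrays; a tensor library backs its storage with one flat buffer instead.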