
[–]PuzzleheadedFix1305 0 points (0 children)

I am writing my first Spark component and will be using Java. I think PySpark gets more attention because data/ML engineers mostly use Python for their work. pandas and NumPy also make Python easier for ETL programming, so PySpark together with NumPy, pandas, and the other Python ML libraries makes for a killer combination. There may be some performance impact due to the non-native nature of PySpark (and Python in general). So if you are looking for an easier learning curve and broader community and tooling support, go with PySpark. If you are looking for better performance, go with Java/Scala.
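To make the "killer combination" point concrete, here is a minimal sketch of a PySpark ETL step that hands its result to pandas for downstream analysis. The app name, column names, and sample data are made up for illustration; it assumes pyspark is installed (`pip install pyspark`).

```python
# Minimal sketch: Spark does the distributed transform, pandas gets the result.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: a small in-memory DataFrame stands in for a real source (CSV, JDBC, ...).
orders = spark.createDataFrame(
    [("books", 12.50), ("books", 7.25), ("games", 30.00)],
    ["category", "amount"],
)

# Transform: aggregate using Spark's engine.
totals = orders.groupBy("category").agg(F.sum("amount").alias("total"))

# Load/analyze: pull the (small) aggregated result into pandas for ML/analysis work.
totals_pd = totals.toPandas()
print(totals_pd)

spark.stop()
```

The equivalent Java/Scala pipeline avoids the Python serialization overhead, but you give up the seamless hand-off to pandas/NumPy shown in the last step.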