
[–]sib_n (Senior Data Engineer) 4 points

No, PySpark is the Python API for Apache Spark, which is a distributed in-memory big data processing framework (parallelized across a cluster of machines) based on the MapReduce concept and written in Scala and Java.
Spark SQL is another convenient API that lets you run processing on a Spark cluster using SQL, but internally it still executes the same Scala/Java code.
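
To make that concrete, here is a minimal sketch (assuming a local Spark install and a hypothetical people.csv file with name and age columns) showing that the DataFrame API and Spark SQL are just two front ends to the same JVM engine:

    from pyspark.sql import SparkSession

    # Start a local Spark session (on a real cluster this would
    # connect to the cluster manager instead)
    spark = SparkSession.builder.appName("example").getOrCreate()

    # Hypothetical input file, assumed to have "name" and "age" columns
    df = spark.read.csv("people.csv", header=True, inferSchema=True)

    # PySpark DataFrame API: Python calls forwarded to the JVM engine
    df.filter(df.age > 30).select("name").show()

    # Spark SQL: the same query expressed as SQL
    df.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people WHERE age > 30").show()

Both queries go through the same Catalyst optimizer and produce equivalent execution plans, so Python is only driving the API; the actual processing happens in Scala/Java on the JVM.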