1. Explain The Core Components Of A Distributed Spark Application.

Answer»
  • Driver – the process that runs the main() method of the program, creates RDDs, and performs transformations and actions on them (see the sketch after this list).
  • Executor – the worker processes that run the individual tasks of a Spark job and return results to the driver.
  • Cluster Manager – a pluggable component that launches executors and drivers. It allows Spark to run on top of external managers such as Apache Mesos or YARN, in addition to Spark's own standalone manager.
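A minimal sketch of how the three pieces interact, assuming a standalone Scala build with the Spark core dependency on the classpath; the object name, app name, and input path are placeholders:

  import org.apache.spark.{SparkConf, SparkContext}

  object WordLengths {
    def main(args: Array[String]): Unit = {
      // Driver: main() builds the SparkContext and plans the job.
      // The master URL selects the cluster manager; "local[*]" runs
      // everything in-process for testing, while on a real cluster
      // spark-submit would supply "yarn" or a Mesos URL instead.
      val conf = new SparkConf().setAppName("WordLengths").setMaster("local[*]")
      val sc   = new SparkContext(conf)

      // Transformations are recorded lazily on the driver; no work runs yet.
      val lengths = sc
        .textFile("words.txt")        // placeholder input path
        .flatMap(_.split("\\s+"))
        .map(_.length)

      // An action triggers the executors to run the individual tasks
      // and ship the result back to the driver.
      println(s"Total characters: ${lengths.sum()}")

      sc.stop()
    }
  }

Note that in practice the master URL and other cluster-manager settings are usually supplied through spark-submit rather than hard-coded, which is what makes the cluster manager pluggable.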


