1. Explain the core components of a distributed Spark application.

Answer» Driver: the process that runs the main() method of the application, creates RDDs, and performs transformations and actions on them. Executor: the worker processes that run the individual tasks of a Spark job. Cluster Manager: a pluggable component in Spark that launches Executors and Drivers; it allows Spark to run on top of external managers such as Apache Mesos or YARN, in addition to Spark's own standalone manager.
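These three roles all appear in a typical spark-submit invocation. The sketch below assumes a YARN cluster; the class name and jar are hypothetical placeholders:

```shell
# --master selects the cluster manager (yarn, mesos://..., spark://..., or local).
# --deploy-mode cluster runs the driver process on the cluster itself,
# and the executor flags size the worker processes that run the tasks.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 2g \
  --num-executors 4 \
  --executor-memory 4g \
  --executor-cores 2 \
  --class com.example.MyApp \
  my-app.jar
```

With `--deploy-mode client` instead, the driver would run on the submitting machine while the cluster manager still launches the executors on worker nodes.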