1.

What are the key components of Spark that it internally requires to execute a job?

Answer»
  •  Spark follows a master/slave architecture:
    •  Master daemon: the master/driver process
    •  Worker daemon: the slave process
  • A Spark cluster has a single master.
  • Any number of slaves/workers run as commodity servers.
  • When we submit a Spark job, it triggers the Spark driver (see the sketch after this list).
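
A minimal sketch of this flow, assuming a hypothetical WordCount application, input path, and master URL: the main method of the class handed to spark-submit runs as the driver process, which then asks the master for executors on the worker (slave) nodes.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Submitted with, for example (master URL and jar name are illustrative):
//   spark-submit --class WordCount --master spark://master-host:7077 app.jar
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("word-count")
    val sc = new SparkContext(conf) // the driver's gateway to the cluster

    val counts = sc.textFile("input.txt") // hypothetical input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```
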
  • Once running, the driver (through its SparkContext) supports operations such as the following, illustrated in the sketch after this list:
  • Getting the current status of the Spark application
  • Canceling a job
  • Canceling a stage
  • Running a job synchronously
  • Running a job asynchronously
  • Accessing a persistent RDD
  • Un-persisting an RDD
  • Programmatic dynamic allocation of executors
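
A minimal Scala sketch of these driver-side operations, assuming a local master purely for illustration (object and variable names are ours, not Spark's):

```scala
import scala.concurrent.Await
import scala.concurrent.duration._

import org.apache.spark.{SparkConf, SparkContext}

object DriverOperationsSketch {
  def main(args: Array[String]): Unit = {
    // Local two-thread "cluster" just for illustration; a real job would
    // get its master URL from spark-submit instead.
    val conf = new SparkConf().setAppName("driver-ops").setMaster("local[2]")
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(1 to 1000000).cache() // mark the RDD as persistent

    // Running a job synchronously: actions such as count() block until done.
    val total = rdd.count()
    println(s"Synchronous count: $total")

    // Running a job asynchronously: the *Async actions return a FutureAction.
    val futureCount = rdd.countAsync()

    // Getting the current status of the application via the status tracker.
    val tracker = sc.statusTracker
    println(s"Active job ids: ${tracker.getActiveJobIds().mkString(", ")}")

    println(s"Asynchronous count: ${Await.result(futureCount, 60.seconds)}")

    // Accessing the persistent RDDs registered with this SparkContext.
    println(s"Persistent RDD ids: ${sc.getPersistentRDDs.keys.mkString(", ")}")

    // Canceling work: a running FutureAction, a specific job, or a stage
    // (job/stage ids come from the status tracker):
    // futureCount.cancel()
    // sc.cancelJob(jobId); sc.cancelStage(stageId)

    // Programmatic dynamic allocation (developer API; requires a cluster
    // manager that supports it, so it is a no-op in local mode):
    // sc.requestExecutors(2); sc.killExecutors(Seq("1"))

    // Un-persisting the RDD releases its cached blocks.
    rdd.unpersist()

    sc.stop()
  }
}
```
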

