1. What is PySpark Architecture?
Answer» PySpark, like Apache Spark, follows a master-slave architecture pattern. The master node is called the Driver, and the slave nodes are called the Workers. When a Spark application is run, the Spark Driver creates a SparkContext, which acts as the entry point to the application. All operations are executed on the worker nodes, and the resources required to execute those operations are managed by the Cluster Manager.
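Below is a minimal sketch of how these pieces appear in code: the driver program creates a SparkSession (which wraps the SparkContext), and operations on an RDD are scheduled by the cluster manager onto workers. The application name and the `local[2]` master URL are illustrative assumptions, not part of the original answer.

```python
from pyspark.sql import SparkSession

# The Driver program creates a SparkSession, which wraps the SparkContext,
# the entry point to the Spark application.
spark = (
    SparkSession.builder
    .appName("ArchitectureDemo")   # hypothetical application name
    .master("local[2]")            # local mode: 2 worker threads stand in for worker nodes
    .getOrCreate()
)

sc = spark.sparkContext            # the underlying SparkContext

# Operations on this RDD are split into tasks that the cluster manager
# schedules onto the workers (here, local threads acting as executors).
rdd = sc.parallelize(range(10), numSlices=2)
print(rdd.map(lambda x: x * x).sum())  # prints 285

spark.stop()
```

In a real cluster deployment, only the master URL would change (for example to a YARN, Mesos, or standalone cluster manager); the driver/worker split described above stays the same.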