1. Why is PySpark SparkConf used?
Answer» PySpark SparkConf is used to set the configurations and parameters required to run an application on a cluster or on a local system. The SparkConf class has the following constructor: class pyspark.SparkConf(loadDefaults=True, _jvm=None, _jconf=None), where:
- loadDefaults: whether to load values from Java system properties (True by default)
- _jvm: internal parameter used to pass a handle to the JVM; not intended to be set by users
- _jconf: optionally, an existing SparkConf Java object whose parameters should be reused
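A minimal sketch of typical SparkConf usage is shown below; the master URL, application name, and memory value are illustrative choices, not required settings:

```python
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Build a configuration object; setMaster/setAppName/set calls are chainable.
conf = (
    SparkConf()
    .setMaster("local[2]")            # run locally with 2 worker threads (illustrative)
    .setAppName("SparkConfDemo")      # application name shown in the Spark UI (illustrative)
    .set("spark.executor.memory", "1g")  # example key/value configuration
)

# Pass the configuration to a SparkSession (or SparkContext) when starting the application.
spark = SparkSession.builder.config(conf=conf).getOrCreate()

# Verify that the configuration was applied.
print(spark.sparkContext.getConf().get("spark.app.name"))

spark.stop()
```

Once the SparkContext has been created, the configuration becomes read-only, so any settings should be applied on the SparkConf object before the session or context is started.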