1. Why is PySpark SparkConf used?

Answer»

PySpark SparkConf is used to set the configurations and parameters required to run an application on a cluster or on a local system. The SparkConf class has the following signature (a usage sketch follows the parameter descriptions below):

class pyspark.SparkConf(loadDefaults=True, _jvm=None, _jconf=None)

where:

  • loadDefaults - a boolean indicating whether values should be loaded from Java system properties. It is True by default.
  • _jvm - belongs to the class py4j.java_gateway.JVMView and is an internal parameter used to pass the handle to the JVM. It does not need to be set by users.
  • _jconf - belongs to the class py4j.java_gateway.JavaObject. This optional parameter can be used to pass an existing SparkConf handle so that its parameters are reused.
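
Below is a minimal sketch of how a SparkConf is typically built and handed to a SparkContext. The application name, master URL, and executor memory value are illustrative choices, not part of the question itself.

from pyspark import SparkConf, SparkContext

# Build a configuration object; each setter returns the SparkConf,
# so the calls can be chained.
conf = (SparkConf()
        .setAppName("SparkConfDemo")          # name shown in the Spark UI (illustrative)
        .setMaster("local[2]")                # run locally with two worker threads (illustrative)
        .set("spark.executor.memory", "1g"))  # example of setting an arbitrary Spark property

# Hand the configuration to the SparkContext that runs the application.
sc = SparkContext(conf=conf)
print(conf.toDebugString())  # prints all configured key/value pairs
sc.stop()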

