1.

Is It Necessary To Start Hadoop To Run Any Apache Spark Application?

Answer»

Starting Hadoop is not mandatory to run a Spark application. Apache Spark has no storage layer of its own; it can use Hadoop HDFS, but HDFS is not required. Data can be stored in, loaded from, and processed on the local file system.



