1.

Which One Will You Choose For A Project – Hadoop MapReduce Or Apache Spark?

Answer

Spark performs its computations in memory, avoiding much of the network and disk I/O that Hadoop MapReduce incurs. However, Spark needs a large amount of RAM and dedicated machines to produce effective results, while MapReduce can run on cheaper commodity hardware. So the decision to use Hadoop MapReduce or Spark depends on the requirements of the project and the budget of the organization.
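To make the in-memory advantage concrete, here is a minimal PySpark sketch. It assumes the pyspark package is installed and uses a hypothetical HDFS file path; it shows a dataset cached in executor memory so a second pass is served from RAM, whereas a MapReduce pipeline would re-read the input from disk for each job.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cache-demo").getOrCreate()

# Hypothetical input file. The first action reads it from disk and
# fills the cache; later actions reuse the in-memory copy.
logs = spark.read.text("hdfs:///data/logs.txt")
logs.cache()  # request that Spark keep this dataset in executor memory

errors = logs.filter(logs.value.contains("ERROR")).count()   # first pass: disk read, populates cache
warnings = logs.filter(logs.value.contains("WARN")).count()  # second pass: served from memory

print(errors, warnings)
spark.stop()

The memory that makes this reuse possible is exactly the cost side of the trade-off: clusters with enough RAM for caching are more expensive than the commodity hardware MapReduce is designed for.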


