1.

What Are The Modules That Constitute The Apache Hadoop 2.0 Framework?

Answer»

Hadoop 2.0 consists of four important modules, three of which are inherited from Hadoop 1.0; YARN is added as a new module.

Hadoop Common – This module consists of the basic utilities and libraries that are required by the other modules.

HDFS – The Hadoop Distributed File System, which stores huge volumes of data on commodity machines across the cluster.

MapReduce – A Java-based programming model for distributed data processing (a minimal example follows the list below).

YARN – A new module introduced in Hadoop 2.0 for cluster resource management and job scheduling.
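For illustration, a minimal word-count job written against the Hadoop 2.x Java MapReduce API (org.apache.hadoop.mapreduce) might look like the sketch below. The WordCount class name and the command-line input/output arguments are illustrative assumptions, not part of the framework itself; the input and output locations would normally be HDFS paths, and on a Hadoop 2.0 cluster YARN allocates the containers in which the map and reduce tasks run.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in the input split read from HDFS.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output are HDFS paths; YARN schedules the map and reduce tasks.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Such a job could be packaged into a jar and submitted with a command along the lines of hadoop jar wordcount.jar WordCount /input /output, where /input and /output are assumed HDFS paths chosen for this example.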



