1.

________ is a programming model designed for processing large volumes of data in parallel by dividing the work into a set of independent tasks.

(a) Hive
(b) MapReduce
(c) Pig
(d) Lucene

This question is from the Data Flow topic in the HDFS – Hadoop Distributed File System section of Hadoop.

Answer

The correct option is (b) MapReduce.

Explanation: MapReduce is the core processing model of Hadoop. An input dataset is divided into splits that are processed by independent map tasks running in parallel across the cluster; their intermediate key/value output is then shuffled to reduce tasks, which aggregate the results. Because the map tasks do not depend on one another, the framework can schedule them on any node and rerun failed tasks transparently.
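For illustration, here is a minimal sketch of the classic word-count job written against the Hadoop MapReduce API (org.apache.hadoop.mapreduce), following the structure of the standard Hadoop tutorial example. The class names, job name, and the input/output paths taken from the command line are illustrative.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Each map task processes one input split independently of the
  // others, emitting a (word, 1) pair for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // After the shuffle, each reduce task receives all values for a
  // given key and aggregates them into a single count.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    // Job name and paths are illustrative; args[0] is the input
    // directory and args[1] the (not yet existing) output directory.
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The framework handles splitting the input, scheduling the independent map tasks, and shuffling intermediate pairs to the reducers; the programmer only supplies the map and reduce functions.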


