
HDFS, by default, replicates each data block _____ times on different nodes and on at least ____ racks.

(a) 3, 2
(b) 1, 2
(c) 2, 3
(d) All of the mentioned

I was asked this question in a unit test. My doubt is from Data Integrity, in the Hadoop I/O section of Hadoop.

Answer»

The correct choice is (a) 3, 2.

The best I can explain: By default, HDFS replicates each block three times, and its rack-aware placement policy spreads those replicas across at least two racks — typically one replica on the writer's node and the other two on different nodes in a single remote rack. This simple yet robust architecture was explicitly designed for data reliability in the face of faults and failures in disks, nodes, and networks.
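The replication factor is configurable per cluster via `dfs.replication` in `hdfs-site.xml`; as an illustration, a sketch of the entry with the default value:

```xml
<!-- hdfs-site.xml: default number of replicas for each block -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```

Replication can also be changed per file after the fact, e.g. `hdfs dfs -setrep -w 2 /some/path` (the path here is just an example), where `-w` waits for the new replication level to be reached.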


