1.

What is Block and what role does Block Scanner play in HDFS?

Answer»

A block is the smallest unit of data that HDFS allocates to a file. Blocks are created automatically by the Hadoop system so that data can be stored across a different set of nodes in the distributed cluster. Large files are automatically sliced into small chunks, called blocks, by Hadoop.
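The slicing described above can be sketched with some simple arithmetic. This is an illustrative example, not Hadoop code; 128 MB is the default block size in Hadoop 2.x and later, and the function name is made up for the sketch.

```python
# Sketch: how HDFS conceptually splits a large file into fixed-size blocks.
# 128 MB is the default block size in Hadoop 2.x+; names are illustrative.
BLOCK_SIZE = 128 * 1024 * 1024  # 128 MB

def split_into_blocks(file_size, block_size=BLOCK_SIZE):
    """Return (block_index, length) pairs covering the whole file."""
    blocks = []
    offset = 0
    while offset < file_size:
        # The final block may be smaller than the configured block size.
        length = min(block_size, file_size - offset)
        blocks.append((offset // block_size, length))
        offset += length
    return blocks

# A 300 MB file yields two full 128 MB blocks plus one 44 MB block.
print(split_into_blocks(300 * 1024 * 1024))
```

Note that only the last block can be shorter than the block size; a 300 MB file therefore occupies three blocks, not three full 128 MB allocations.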

The block scanner, as its name suggests, verifies whether the blocks created by Hadoop are stored successfully on a DataNode. It periodically reads back each block and checks it against its recorded checksum, which helps detect corrupt blocks present on the DataNode so they can be reported and re-replicated.
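The verification idea can be sketched as follows. This is a conceptual illustration, not the actual Hadoop API: the real block scanner uses CRC32-based checksums stored alongside each block replica, while this sketch uses MD5 and made-up function names.

```python
# Conceptual sketch of a block scanner: recompute each stored block's
# checksum and compare it with the checksum recorded at write time,
# flagging mismatches as corrupt. Names are illustrative, not Hadoop's API.
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

def scan_blocks(blocks, recorded_checksums):
    """Return indices of blocks whose data no longer matches its checksum."""
    corrupt = []
    for i, data in enumerate(blocks):
        if checksum(data) != recorded_checksums[i]:
            corrupt.append(i)
    return corrupt

# Example: block 1 is silently corrupted after its checksum was recorded.
blocks = [b"alpha", b"bravo", b"charlie"]
recorded = [checksum(b) for b in blocks]
blocks[1] = b"br4vo"  # simulate bit rot on disk
print(scan_blocks(blocks, recorded))  # [1]
```

In real HDFS, a corrupt replica found this way is reported to the NameNode, which arranges for a healthy replica to be copied from another DataNode.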
