Define data integrity. How does HDFS ensure the integrity of the data blocks stored in HDFS?

devquora

Posted On: Feb 22, 2018

 

Data integrity refers to the correctness and accuracy of data. It is essential to have a guarantee that the data stored in HDFS is correct, yet there is always a slight chance that data will become corrupted during I/O operations on the disks. To guard against this, HDFS computes a checksum for all data written to it and, by default, verifies the data against those checksums during read operations. In addition, each DataNode periodically runs a block scanner that verifies the correctness of the data blocks stored on it.
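A minimal sketch in Java of how a client can see these checksums through Hadoop's public FileSystem API; the path /user/demo/sample.txt is hypothetical, and the cluster configuration is assumed to be available on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileChecksum;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsChecksumDemo {
    public static void main(String[] args) throws Exception {
        // Load core-site.xml / hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/user/demo/sample.txt"); // hypothetical path

        // Checksum verification on read is enabled by default;
        // it can be toggled explicitly.
        fs.setVerifyChecksum(true);

        // Ask HDFS for the file-level checksum it maintains, which is
        // derived from the per-chunk CRCs computed at write time.
        FileChecksum checksum = fs.getFileChecksum(file);
        System.out.println("Algorithm: " + checksum.getAlgorithmName());
        System.out.println("Checksum : " + checksum);

        fs.close();
    }
}
```

The same file-level checksum can be inspected from the command line with `hdfs dfs -checksum /user/demo/sample.txt`, and the checksum granularity is controlled by the `dfs.bytes-per-checksum` property (512 bytes by default).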
