Posted On: Feb 22, 2018
Data integrity refers to the accuracy of data. It is essential to have assurance that the data stored in HDFS is correct. However, there is always a slight chance that data will be corrupted during I/O operations on the disks. To guard against this, HDFS computes a checksum for all data written to it and, by default, verifies the data against these checksums during read operations. Additionally, each DataNode periodically runs a block scanner, which verifies the correctness of the data blocks stored in HDFS.
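To make this concrete, here is a minimal sketch using Hadoop's FileSystem Java API. Checksum verification happens transparently whenever a file is read; the calls below simply expose that machinery. The path /data/example.txt is a hypothetical placeholder.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileChecksum;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ChecksumDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Hypothetical file path; any read of it is checksum-verified by default.
            Path file = new Path("/data/example.txt");

            // Retrieve the file-level checksum explicitly.
            FileChecksum checksum = fs.getFileChecksum(file);
            System.out.println(file + " checksum: " + checksum);

            // Verification can be switched off for subsequent reads,
            // e.g. to salvage what remains of a corrupted file.
            fs.setVerifyChecksum(false);
        }
    }

The same checksum can also be inspected from the command line with hdfs dfs -checksum, which is a quick way to confirm that two copies of a file are byte-identical.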