In Hadoop, MapReduce splits a job into many tasks that run in parallel rather than sequentially, which reduces overall execution time. This execution model is sensitive to slow tasks (often called stragglers), because a single slow task delays the completion of the entire job. A task may run slowly for many reasons, including hardware degradation or software misconfiguration, but the cause can be hard to detect: the task still completes successfully, only taking longer than expected. Hadoop does not try to diagnose and fix slow-running tasks; instead, it tries to detect them and launches equivalent backup tasks on other nodes, using whichever copy finishes first. This is called speculative execution in Hadoop.
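Speculative execution is enabled by default and can be toggled separately for map and reduce tasks. A minimal configuration sketch using the standard Hadoop 2.x/3.x property names in `mapred-site.xml` (the exact values shown here are illustrative):

```xml
<!-- mapred-site.xml: toggle speculative execution (enabled by default) -->
<configuration>
  <property>
    <name>mapreduce.map.speculative</name>
    <value>true</value>
    <description>If true, multiple instances of some map tasks
    may be executed in parallel.</description>
  </property>
  <property>
    <name>mapreduce.reduce.speculative</name>
    <value>true</value>
    <description>If true, multiple instances of some reduce tasks
    may be executed in parallel.</description>
  </property>
</configuration>
```

The same settings can also be applied per job through the MapReduce job configuration API, which is useful when speculative execution should be disabled for jobs with non-idempotent tasks (for example, tasks that write directly to an external system, where a duplicate backup task could cause double writes).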