Posted On: Apr 07, 2020
A core is a computation unit of the CPU. In Spark, the number of cores assigned to an executor controls how many tasks that executor can run concurrently. Spark Core, by contrast, is the base foundation of the entire Spark project: the engine for distributed execution, with all other functionality built on top. It assists with scheduling, task dispatching, input and output operations, and more.
Spark Core provides functionality such as fault tolerance, monitoring, in-memory computation, memory management, and task scheduling.
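The relationship between executor cores and concurrent tasks can be sketched as below. The property names `spark.executor.cores` and `spark.task.cpus` are standard Spark configuration settings, but the helper function itself is illustrative only, not part of any Spark API:

```python
# Sketch: how executor cores bound task parallelism in Spark.
# An executor runs at most executor_cores // task_cpus tasks at once,
# where executor_cores comes from spark.executor.cores and task_cpus
# from spark.task.cpus (default 1).

def concurrent_tasks_per_executor(executor_cores: int, task_cpus: int = 1) -> int:
    """Upper bound on tasks one executor can run at the same time."""
    return executor_cores // task_cpus

# Example: a cluster with 4 executors of 8 cores each, 1 CPU per task.
executors = 4
slots_per_executor = concurrent_tasks_per_executor(8, task_cpus=1)
total_slots = executors * slots_per_executor
print(total_slots)  # 32 task slots across the cluster
```

Under these assumed settings the cluster can run 32 tasks in parallel; raising `spark.task.cpus` to 2 would halve that to 16.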