How can we create RDDs in Apache Spark?

devquora

Posted On: Feb 22, 2018

 

There are basically two ways to create an RDD (see the sketch after this list):

  • By parallelizing an existing collection in the driver program, using SparkContext's parallelize() method.
  • By loading an external dataset from external storage (for example HDFS, HBase, or any file system supported by Hadoop), using SparkContext's textFile() or a similar method.
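
A minimal sketch in Scala showing both ways of creating an RDD; the application name, master URL, and the input path "data/input.txt" are placeholders chosen only for illustration:

import org.apache.spark.{SparkConf, SparkContext}

object RddCreationExample {
  def main(args: Array[String]): Unit = {
    // Local SparkContext for demonstration purposes
    val conf = new SparkConf().setAppName("rdd-creation").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Method 1: parallelize a collection that exists in the driver program
    val numbers = Seq(1, 2, 3, 4, 5)
    val parallelizedRdd = sc.parallelize(numbers)

    // Method 2: load an external dataset from storage (local path, HDFS, S3, ...)
    // "data/input.txt" is a hypothetical path used here only as an example.
    val fileRdd = sc.textFile("data/input.txt")

    println(s"Parallelized RDD has ${parallelizedRdd.count()} elements")
    sc.stop()
  }
}

Both parallelize() and textFile() return an RDD lazily; no data is actually read or distributed until an action such as count() is called.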


    Related Questions

    Apache Spark Interview Questions

    What is Apache Spark?

    Apache Spark is a processing framework that is extremely fast and convenient to use...

    Apache Spark Interview Questions

    Can you mention some features of Spark?

    On a general note, the most essential features of Apache Spark are...

    Apache Spark Interview Questions

    Can you define RDD?

    RDD is the acronym for Resilient Distributed Dataset...