Spark SQL Programming Interview Questions
1. Explain about the common workflow of a Spark program.

Answer:
1. The first step in a Spark program is to create input RDDs from external data.
2. Apply RDD transformations such as filter() to derive new, transformed RDDs according to the business logic.
3. persist() any intermediate RDDs that may need to be reused later.
4. Launch RDD actions such as first() and count() to kick off the parallel computation, which Spark then optimizes and executes (see the sketch below).
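To make the four steps concrete, here is a minimal sketch using Spark's Scala RDD API. The file path input.txt and the ERROR-filtering predicate are hypothetical placeholders chosen for illustration, not part of the original answer.

import org.apache.spark.{SparkConf, SparkContext}

object WorkflowSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WorkflowSketch").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Step 1: create an input RDD from external data.
    val lines = sc.textFile("input.txt")

    // Step 2: apply a transformation (lazy; nothing runs yet).
    val errors = lines.filter(line => line.contains("ERROR"))

    // Step 3: persist the intermediate RDD because it is used by two actions below.
    errors.persist()

    // Step 4: launch actions; only now does Spark build and execute the job.
    println(errors.first())   // first matching line (throws if none match)
    println(errors.count())   // total number of matching lines

    sc.stop()
  }
}

Note that nothing is computed until step 4: transformations only record lineage, and persist() marks errors for caching so the second action (count()) reuses the filtered data instead of rereading the file.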