Certified Associate Developer for Apache Spark Exam Questions

Certified Associate Developer for Apache Spark Exam - Question 130


Which of the following will cause a Spark job to fail?

Correct Answer: E

A failed driver node will cause a Spark job to fail because the driver orchestrates the entire execution process: it schedules tasks, tracks their progress, and holds the application state. If the driver node fails, the job cannot continue. In contrast, the failure of a worker node (option D) is handled by Spark's fault-tolerance mechanisms, which retry the lost tasks on other nodes. The remaining options (A, B, and C) describe scenarios that would degrade performance or require resource adjustments but would not directly cause a Spark job to fail.
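As a rough illustration of why worker-side failures are survivable while driver failures are not, Spark exposes retry settings that govern task-level recovery. The following is a sketch of a `spark-defaults.conf` fragment, not a complete or recommended configuration; the values shown are the current defaults:

```properties
# spark-defaults.conf (sketch)

# How many times a single task may fail before the whole job is aborted.
# Task failures on executors/worker nodes are retried up to this limit.
spark.task.maxFailures               4

# How many consecutive attempts a stage gets before it is aborted.
spark.stage.maxConsecutiveAttempts   4
```

Note that these settings only cover executor-side failures. In standalone cluster mode, `spark-submit --supervise` can restart a failed driver process, but within a single run a driver failure still terminates the job, which is why E is the answer.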

Discussion

1 comment
Sowwy1 · Option: E
Apr 10, 2024

E is correct