Exam: Certified Associate Developer for Apache Spark
Question 130

Which of the following will cause a Spark job to fail?

    Correct Answer: E

    A failed driver node will cause a Spark job to fail because the driver orchestrates the entire execution: it schedules tasks, tracks their progress, and holds the job's state. If the driver fails, no component is left to coordinate the work, so the job terminates. In contrast, the failure of a worker node (option D) is handled by Spark's fault-tolerance mechanisms, which retry the lost tasks on other nodes. The remaining options (A, B, and C) describe scenarios that degrade performance or call for resource adjustments but do not directly cause a Spark job to fail.
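    For context on the worker-node case, Spark retries a failed task a configurable number of times before giving up on the whole job; the limit is controlled by the `spark.task.maxFailures` setting. Below is a minimal PySpark sketch illustrating where that setting is applied (the app name and the trivial workload are just placeholders for illustration):

    ```python
    # Minimal sketch: worker-side task failures are retried automatically,
    # up to spark.task.maxFailures attempts (default 4). Only after the
    # retries are exhausted does the job itself fail. A driver failure,
    # by contrast, ends the job immediately because nothing remains to
    # schedule or track tasks.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("fault-tolerance-demo")          # hypothetical app name
        .config("spark.task.maxFailures", "4")    # task retry limit per job
        .getOrCreate()
    )

    # Trivial placeholder workload: count a generated range of rows.
    df = spark.range(1_000_000)
    print(df.count())

    spark.stop()
    ```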

Discussion
Sowwy1 — Option: E

E is correct