Exam DP-203
Question 313

You have several Azure Data Factory pipelines that contain a mix of the following types of activities:

✑ Wrangling data flow

✑ Notebook

✑ Copy

✑ Jar

Which two Azure services should you use to debug the activities? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

    Correct Answer: D (Azure Data Factory), E (Azure Databricks)

    When debugging activities in Azure Data Factory pipelines, Azure Data Factory itself should be used for activities such as Copy and Wrangling data flow, as it provides the necessary monitoring and debugging capabilities directly within its interface. For Notebook and Jar activities, Azure Databricks is more suitable because it offers a rich interactive environment for detailed examination and troubleshooting of code, variables, and data processing. Azure Machine Learning and Azure Synapse Analytics do not provide the same level of integrated debugging capabilities for these activity types as Data Factory and Databricks do.
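For context on how these activity types sit together in one pipeline, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. Every name in it (subscription, resource group, factory, linked service, datasets, pipeline) is a placeholder invented for illustration, not something taken from the question.

```python
# Minimal sketch, assuming azure-identity and azure-mgmt-datafactory are installed
# and that the factory, linked services, and datasets named below already exist.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink,
    BlobSource,
    CopyActivity,
    DatabricksNotebookActivity,
    DatabricksSparkJarActivity,
    DatasetReference,
    LinkedServiceReference,
    PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-demo"              # placeholder
FACTORY_NAME = "adf-demo"               # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Copy (and wrangling) work runs inside Data Factory, so it is monitored and
# debugged from the ADF interface.
copy_activity = CopyActivity(
    name="CopyRawData",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SinkBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Notebook and Jar activities execute on a Databricks cluster, so their code is
# debugged in the Databricks workspace.
dbx = LinkedServiceReference(
    type="LinkedServiceReference", reference_name="AzureDatabricksLinkedService"
)
notebook_activity = DatabricksNotebookActivity(
    name="RunNotebook", notebook_path="/Shared/transform", linked_service_name=dbx
)
jar_activity = DatabricksSparkJarActivity(
    name="RunJar", main_class_name="com.example.Main", linked_service_name=dbx
)

pipeline = PipelineResource(activities=[copy_activity, notebook_activity, jar_activity])
client.pipelines.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "MixedActivityPipeline", pipeline)
```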

Discussion
KrishIC (Options: DE)

Notebook = Azure Databricks; managing activities in the pipeline = Data Factory.

Sr18

This was on my test on 26-Jun with a 930+ score. I chose the above and it's 100% correct.

ElHomo2222 (Options: DE)

D & E; Databricks for Wrangling and Notebooks; ADF for Copy and Jar

kilowd

Wrangling and Copy = ADF; Jar and Notebooks = Databricks

nicky87654 (Options: DE)

Wrangling and Copy = ADF; Jar and Notebooks = Databricks

vctrhugo (Options: DE)

D. Azure Data Factory: Azure Data Factory itself provides debugging capabilities for its activities. You can monitor and debug the execution of pipeline activities directly within the Azure Data Factory interface. It allows you to view activity run details, input/output data, and logs, and to diagnose any errors or issues encountered during execution.

E. Azure Databricks: Azure Databricks is a powerful analytics platform that integrates well with Azure Data Factory. You can use it to debug and analyze Notebook activities within the Data Factory pipelines. Azure Databricks provides an interactive environment to run and debug notebooks, allowing you to inspect intermediate data, execute code step by step, and troubleshoot any issues.
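To complement the point above about viewing activity run details and errors in ADF, the sketch below pulls the same per-activity run information programmatically with the azure-mgmt-datafactory SDK. It reuses the placeholder names from the earlier sketch and assumes a pipeline called MixedActivityPipeline exists.

```python
# Sketch only: inspect a pipeline run the way the ADF monitoring view does.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-demo"              # placeholder
FACTORY_NAME = "adf-demo"               # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger the pipeline and keep the run id.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, "MixedActivityPipeline", parameters={}
)

# Query the individual activity runs: status, input/output, and error details for
# Copy/wrangling activities live here; for Databricks Notebook and Jar activities
# the activity output also carries a runPageUrl pointing at the Databricks run.
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(hours=1),
    last_updated_before=datetime.utcnow() + timedelta(hours=1),
)
activity_runs = client.activity_runs.query_by_pipeline_run(
    RESOURCE_GROUP, FACTORY_NAME, run.run_id, filters
)
for activity in activity_runs.value:
    print(activity.activity_name, activity.status, activity.error)
```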

janaki (Options: DE)

D - Azure Data Factory E - Azure Databricks

vrodriguesp (Options: DE)

Notebook on Azure Databricks, the rest on the Data Factory pipeline. No sense for A and C.

auwia (Options: DE)

Wrangling and Copy = ADF; Jar and Notebooks = Databricks

pavankr (Options: DE)

You "de-bug" the activity with ML??? Seriously??? come on man??? from where you are getting these answers???

Mohamedali.Cintellic (Options: DE)

D & E are correct

Deeksha1234 (Options: DE)

should be DE

Alongi (Options: DE)

I found this question on my exam 30/04/2024, and I put DE. I passed the exam with a high score, but I'm not sure if the answer is correct.

swathi_rs (Options: DE)

Notebook -> Databricks; Copy -> ADF

MBRSDG (Options: DE)

Just a detail: there's another question on examtopics, perfectly identical, with solution DE ...

Azure_2023 (Options: DE)

The two Azure services you should use to debug the activities are Azure Data Factory and Azure Databricks. Azure Data Factory provides a comprehensive debugging experience for all types of activities, including wrangling data flows, notebooks, and copy activities. You can use the Data Factory UI or command-line tools to step through activities, inspect data, and identify errors. Azure Databricks is a cloud-based platform for big data processing that offers a rich debugging experience for Jar activities. You can use Databricks notebooks to debug Jar code, inspect variables, and set breakpoints.
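Following on from the comment above, if you want to chase a Notebook or Jar failure into Databricks itself, one option is the Databricks Jobs API. This is a sketch under assumptions: the workspace URL and a personal access token are available in environment variables, and the run id is the one ADF surfaces for the Databricks activity (for example via its runPageUrl output).

```python
# Sketch only: look up a Databricks job run (Jobs API 2.1) to continue debugging
# a Notebook or Jar activity on the Databricks side.
import os

import requests

WORKSPACE_URL = os.environ["DATABRICKS_HOST"]   # e.g. https://adb-<workspace-id>.<n>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]          # personal access token (assumed available)


def get_databricks_run(run_id: int) -> dict:
    """Fetch the state of a Databricks job run."""
    resp = requests.get(
        f"{WORKSPACE_URL}/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"run_id": run_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


run = get_databricks_run(run_id=12345)          # placeholder run id
print(run["state"]["life_cycle_state"], run["state"].get("result_state"))
print(run["run_page_url"])  # open this in the workspace to inspect the notebook/Jar run
```

From the run page you can get to the notebook output or Spark driver logs, which is where the interactive step-through debugging actually happens.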

kkk5566 (Options: DE)

is correct

martinamartina

Couldn't it be AD?

dsp17 (Options: DE)

DE - correct