Exam DP-300
Question 207

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse.

Does this meet the goal?

    Correct Answer: A (Yes)

    Scheduling an Azure Databricks job that executes an R notebook meets all three requirements: Azure Databricks can read the incremental data from the staging zone in Azure Data Lake Storage, transform it by running R code in the notebook, and write the transformed data to the data warehouse (dedicated SQL pool) in Azure Synapse Analytics. Because the job can be scheduled to run daily, this solution meets the goal.
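
    As a rough sketch of what such an R notebook might contain (using SparkR and the Azure Databricks Synapse connector; the storage account, container, paths, table name, and connection string below are placeholders, and the cluster is assumed to already have the storage credentials configured):

    # Minimal SparkR sketch of the daily job body (illustrative, not the official solution).
    library(SparkR)
    sparkR.session()

    # Read only the previous day's incremental files from the staging zone in ADLS Gen2.
    run_date <- format(Sys.Date() - 1, "%Y-%m-%d")
    staging_path <- paste0("abfss://staging@<storageaccount>.dfs.core.windows.net/ingest_date=", run_date)
    raw <- read.df(staging_path, source = "parquet")

    # Transform the data in R (placeholder for the real transformation logic).
    transformed <- filter(raw, isNotNull(raw$id))

    # Append the result to the dedicated SQL pool through the Databricks Synapse connector,
    # which stages the data in tempDir before loading it into the warehouse.
    write.df(
      transformed,
      source = "com.databricks.spark.sqldw",
      url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<pool>;user=<user>;password=<password>",
      forwardSparkAzureStorageCredentials = "true",
      dbTable = "dbo.StagedFact",
      tempDir = "abfss://tempdata@<storageaccount>.dfs.core.windows.net/synapse-staging",
      mode = "append"
    )

    The notebook would then be attached to a Databricks job with a daily schedule, which is what satisfies the "daily process" part of the requirement.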

Discussion
cusman

DP-203

U_C
Option: A

Yes, this solution meets the goal. By scheduling an Azure Databricks job that executes an R notebook, you can transform the data from the staging zone in your Azure Data Lake Storage account. Then, by inserting the data into the data warehouse in Azure Synapse Analytics, you can complete the daily process of ingesting incremental data. So, the answer is A. Yes.

Sr18

This is a DP-203 question, but the answer is still Yes: Databricks can do this, and very easily.

KIET2131
Option: A

A. Yes, this solution meets the goal of ingesting incremental data from the staging zone, transforming the data by executing an R script, and inserting the transformed data into a data warehouse in Azure Synapse Analytics using Azure Databricks. The scheduled Azure Databricks job can be used to execute the R notebook and insert the transformed data into the data warehouse.

Ciupaz

Not related to DP-300 exam.

o2091
Option: A

looks good, what do you think?