
DP-201 Exam - Question 88


HOTSPOT -

You are designing an Azure Data Factory solution that will download up to 5 TB of data from several REST APIs.

The solution must meet the following staging requirements:

✑ Ensure that the data can be landed quickly and in parallel to a staging area.

✑ Minimize the need to return to the API sources to retrieve the data again should a later activity in the pipeline fail.

The solution must meet the following analysis requirements:

✑ Ensure that the data can be loaded in parallel.

✑ Ensure that users and applications can query the data without requiring an additional compute engine.

What should you include in the solution to meet the requirements? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

[Image: two drop-down lists, one for the staging area and one for the analysis environment]

Correct Answer:

[Image: Staging area = Azure Blob storage; Analysis = Azure Synapse Analytics]

Box 1: Azure Blob storage

When you activate the staging feature, the data is first copied from the source data store to the staging storage (bring your own Azure Blob storage or Azure Data Lake Storage Gen2).
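As a rough sketch, the staging feature is switched on through the copy activity's enableStaging and stagingSettings properties. In the Python snippet below, which just builds the relevant JSON fragment, the linked-service name and container path are hypothetical placeholders:

```python
import json

# Fragment of a copy activity's typeProperties that activates staged copy.
# "BlobStagingLinkedService" and the path are hypothetical names.
staged_copy_settings = {
    "enableStaging": True,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "BlobStagingLinkedService",  # an Azure Blob storage linked service
            "type": "LinkedServiceReference",
        },
        # Blob container/folder where the data is landed before the sink load
        "path": "staging-container/rest-api-landing",
    },
}

print(json.dumps(staged_copy_settings, indent=2))
```

Landing the data in Blob storage first means a failed downstream activity can be rerun from the staged copy rather than by re-pulling 5 TB from the REST APIs.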

Box 2: Azure Synapse Analytics

The Azure Synapse Analytics connector in the copy activity provides built-in data partitioning to copy data in parallel, which satisfies the parallel-load requirement. Synapse Analytics also meets the second analysis requirement: users and applications can query the data directly, without an additional compute engine.
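As an illustrative sketch, the load from the Blob staging area into a Synapse dedicated SQL pool can run in parallel by letting the sink use the COPY statement (PolyBase is the other bulk option). The activity and dataset names below are hypothetical:

```python
import json

# Sketch of a copy activity that bulk-loads staged Blob files into
# Azure Synapse Analytics. All names are hypothetical placeholders.
load_activity = {
    "name": "LoadStagingToSynapse",
    "type": "Copy",
    "inputs": [{"referenceName": "BlobStagingDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SynapseTableDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "DelimitedTextSource"},
        "sink": {
            "type": "SqlDWSink",
            # The COPY statement loads data in parallel across the
            # dedicated SQL pool's distributions
            "allowCopyCommand": True,
        },
    },
}

print(json.dumps(load_activity, indent=2))
```

Once loaded, the tables can be queried with ordinary T-SQL from the dedicated SQL pool itself, so no separate compute engine is needed.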

Reference:

https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance-features
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-sql-data-warehouse

Discussion

2 comments
anamaster
Apr 18, 2021

Correct, but the explanation for Synapse should be that ASA allows querying the data without an additional compute engine.

Marcus1612
Sep 30, 2021

Look at this: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance-features

"When you activate the staging feature, first the data is copied from the source data store to the staging storage (bring your own Azure Blob or Azure Data Lake Storage Gen2). Next, the data is copied from the staging to the sink data store. The copy activity automatically manages the two-stage flow for you, and also cleans up temporary data from the staging storage after the data movement is complete."