
DP-200 Exam - Question 130


DRAG DROP -

You have an Azure data factory.

You need to ensure that pipeline-run data is retained for 120 days. The solution must ensure that you can query the data by using the Kusto query language.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.

Select and Place:

[Image: list of available actions and the answer area]
Correct Answer:

Step 1: Create an Azure Storage account that has a lifecycle policy

To automate common data management tasks, Microsoft built an internal service on Azure Data Factory called Data Lifecycle Management. The service keeps frequently accessed data available and archives or purges other data according to retention policies; teams across the company use it to reduce storage costs, improve app performance, and comply with data retention requirements.
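As an illustrative aside (the discussion below argues a storage account may not be needed for this question), a lifecycle-management policy can be attached to a storage account programmatically. A minimal sketch using the Azure SDK for Python (azure-mgmt-storage); the subscription, resource group, account name, and the 120-day delete rule are all placeholders, not the exam's required values:

```python
# Minimal sketch: attach a lifecycle policy that deletes block blobs
# 120 days after their last modification. All resource names below are
# placeholders. Requires the azure-identity and azure-mgmt-storage packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = {
    "policy": {
        "rules": [
            {
                "enabled": True,
                "name": "retain-120-days",
                "type": "Lifecycle",
                "definition": {
                    "filters": {"blob_types": ["blockBlob"]},
                    "actions": {
                        "base_blob": {
                            "delete": {"days_after_modification_greater_than": 120}
                        }
                    },
                },
            }
        ]
    }
}

# "default" is the only allowed management-policy name on a storage account.
client.management_policies.create_or_update(
    "<resource-group>", "<storage-account>", "default", policy
)
```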

Step 2: Create a Log Analytics workspace that has Data Retention set to 120 days.

Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs to several different targets. Storage account: save your diagnostic logs to a storage account for auditing or manual inspection; the diagnostic settings let you specify the retention time in days.
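To make the retention requirement concrete, here is a minimal sketch of creating a workspace with Data Retention set to 120 days using the Azure SDK for Python (azure-mgmt-loganalytics); the names, region, and pricing tier are assumptions:

```python
# Minimal sketch: create a Log Analytics workspace whose data retention
# is 120 days. Names, region, and SKU are placeholders/assumptions.
# Requires the azure-identity and azure-mgmt-loganalytics packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

client = LogAnalyticsManagementClient(DefaultAzureCredential(), "<subscription-id>")

workspace = client.workspaces.begin_create_or_update(
    "<resource-group>",
    "adf-monitoring-workspace",          # hypothetical workspace name
    {
        "location": "westeurope",        # assumed region
        "retention_in_days": 120,        # the question's 120-day requirement
        "sku": {"name": "PerGB2018"},    # assumed pay-as-you-go tier
    },
).result()

print(workspace.retention_in_days)  # expect: 120
```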

Step 3: From the Azure portal, add a diagnostic setting.

Step 4: Send the data to a Log Analytics workspace.

Another target is an Event Hub: a pipeline that transfers events from services to Azure Data Explorer.
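Once pipeline-run records are flowing into the workspace, the question's Kusto requirement can be exercised. A hedged sketch using the azure-monitor-query package; the workspace ID is a placeholder, and the ADFPipelineRun table assumes the diagnostic setting was created with resource-specific destination tables (with the older AzureDiagnostics mode you would instead filter the AzureDiagnostics table on Category == "PipelineRuns"):

```python
# Minimal sketch: query pipeline-run data with the Kusto query language.
# The workspace ID is a placeholder; assumes resource-specific destination
# tables (ADFPipelineRun). Requires azure-identity and azure-monitor-query.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Count failed runs per pipeline over the full 120-day retention window.
kql = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

response = client.query_workspace(
    "<workspace-id>", kql, timespan=timedelta(days=120)
)
for table in response.tables:
    for row in table.rows:
        print(row)
```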

Keeping Azure Data Factory metrics and pipeline-run data.

Configure diagnostic settings and workspace.

Create or add diagnostic settings for your data factory.

1. In the portal, go to Monitor. Select Settings > Diagnostic settings.

2. Select the data factory for which you want to set a diagnostic setting.

3. If no settings exist on the selected data factory, you're prompted to create a setting. Select Turn on diagnostics.

4. Give your setting a name, select Send to Log Analytics, and then select a workspace from Log Analytics Workspace.

5. Select Save.
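For completeness, the same diagnostic setting can be scripted instead of clicked through. A minimal sketch using azure-mgmt-monitor that mirrors steps 1-5 above; the setting name and both resource IDs are placeholders:

```python
# Minimal sketch: scripted equivalent of portal steps 1-5. The resource
# IDs and setting name are placeholders. Requires azure-identity and
# azure-mgmt-monitor.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

factory_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.DataFactory/factories/<factory-name>"
)
workspace_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
)

client.diagnostic_settings.create_or_update(
    factory_id,
    "adf-to-log-analytics",  # hypothetical setting name
    {
        "workspace_id": workspace_id,
        # Route the PipelineRuns category; ActivityRuns and TriggerRuns
        # can be enabled the same way if needed.
        "logs": [{"category": "PipelineRuns", "enabled": True}],
    },
)
```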

Reference:

https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor

Discussion

MsIrene
Apr 10, 2021

Storage account is not needed here. I would say the order would be as follows: Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days. Step 2: From the Azure portal, add a diagnostic setting. Step 3: Select the PipelineRuns Category. Step 4: Send the data to a Log Analytics workspace.

Kenai
Apr 11, 2021

100% agree

Maddaa
Apr 27, 2021

Agree!

cadio30
May 7, 2021

Support this solution

Wendy_DK
Apr 20, 2021

I agree. Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days. Step 2: From Azure Portal, add a diagnostic setting. Step 3: Select the PipelineRuns Category Step 4: Send the data to a Log Analytics workspace.

mric
Jun 25, 2021

According to the linked article, the order is: first Storage Account, then Event Hub, and finally Log Analytics. So I would say: 1. Create an Azure Storage account with a lifecycle policy. 2. Stream to an Azure Event Hub. 3. Create a Log Analytics workspace that has Data Retention set to 120 days. 4. Send the data to a Log Analytics workspace. Source: https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#keeping-azure-data-factory-metrics-and-pipeline-run-data