DP-200 Exam Questions

DP-200 Exam - Question 158


HOTSPOT

You have a new Azure Data Factory environment.

You need to periodically analyze pipeline executions from the last 60 days to identify trends in execution durations. The solution must use Azure Log Analytics to query the data and create charts.

Which diagnostic settings should you configure in Data Factory? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

[Hot area image]
Correct Answer:
[Answer image]

Log type: PipelineRuns

A pipeline run in Azure Data Factory defines an instance of a pipeline execution.

Storage location: An Azure Storage account

Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Azure Monitor, you can route diagnostic logs for analysis. You can also keep them in a storage account so that you have factory information for your chosen duration.
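
For illustration, below is a minimal sketch of creating such a diagnostic setting with the azure-mgmt-monitor Python SDK. The resource IDs and the setting name are placeholders (not from the source), and exact model names can vary by SDK version.

```python
# Minimal sketch (assumption): route Data Factory "PipelineRuns" logs to a
# storage account with a 60-day retention policy via azure-mgmt-monitor.
# All resource IDs and the setting name below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource,
    LogSettings,
    RetentionPolicy,
)

subscription_id = "<subscription-id>"  # placeholder
adf_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.DataFactory/factories/<factory-name>"
)  # placeholder
storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)  # placeholder

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

client.diagnostic_settings.create_or_update(
    resource_uri=adf_resource_id,
    name="adf-pipeline-runs",  # arbitrary setting name
    parameters=DiagnosticSettingsResource(
        storage_account_id=storage_account_id,
        logs=[
            LogSettings(
                category="PipelineRuns",
                enabled=True,
                # Retention in days applies to the storage-account destination.
                retention_policy=RetentionPolicy(enabled=True, days=60),
            )
        ],
    ),
)
```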

Save your diagnostic logs to a storage account for auditing or manual inspection. You can use the diagnostic settings to specify the retention time in days.
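
If the logs are also routed to a Log Analytics workspace (the choice several commenters below argue for), the 60-day trend analysis from the question could be run as a Kusto query against the resource-specific ADFPipelineRun table. A minimal sketch using the azure-monitor-query Python package follows; the workspace ID is a placeholder, and the table name assumes resource-specific destination tables were selected in the diagnostic setting.

```python
# Minimal sketch (assumption): query pipeline-run durations for the last 60 days
# from a Log Analytics workspace using the azure-monitor-query package.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

workspace_id = "<log-analytics-workspace-id>"  # placeholder

# Daily average pipeline duration per pipeline. In the Log Analytics portal,
# appending "| render timechart" to this query would draw the trend chart.
query = """
ADFPipelineRun
| where TimeGenerated > ago(60d)
| extend DurationMin = datetime_diff('minute', End, Start)
| summarize AvgDurationMin = avg(DurationMin) by bin(TimeGenerated, 1d), PipelineName
| order by TimeGenerated asc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(workspace_id, query, timespan=timedelta(days=60))

for table in response.tables:
    for row in table.rows:
        print(list(row))
```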

Reference:

https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers
https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor

Discussion

12 comments
AUgigi
Mar 2, 2020

Should it be sent to Azure Log Analytics instead, considering "The solution must use Azure Log Analytics to query the data and create charts"?

avestabrzn
Mar 8, 2020

You need to send your data to a storage account in order to query your logs and create charts via Log Analytics. But instead, you can store it directly in Log Analytics. The question is still tricky, though.

krisspark
Jun 28, 2020

The correct answer is "storage account", as the question asks for an explicit "60 days" retention, which is not available in the Azure Log Analytics and Event Hub configurations.

akn1
Jul 9, 2020

Azure Monitor: it is also possible to specify different retention settings for individual data types, from 30 to 730 days: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage#retention-by-data-type

SachinKumar2
Mar 7, 2020

Yes, it should be Azure Log Analytics.

Rohan21
Jun 20, 2020

Storage Account is correct: with a Storage Account you can specify the retention in days, but the same is not possible with Log Analytics.

amar111
Jul 1, 2020

The retention period is nowhere mentioned in the question.

amar111
Jul 1, 2020

It's just the time period you want to query on.

akram786
Mar 9, 2021

It should be Log Analytics: https://docs.microsoft.com/en-us/answers/questions/214414/data-factory-diagnostic-settings.html

AyeshJr
Jan 22, 2021

Log Analytics should be the correct answer, since it can retain data for longer periods of time, which in turn means that you can query the data from the past 60 days. https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage#log-analytics-and-security-center

satyamkishoresingh
Sep 28, 2021

Agreed on the retention, but where in the link does it say ADF can store that information directly in Log Analytics?

rsm2020
Aug 29, 2020

It should be Azure Log Analytics.

syu31svc
Nov 28, 2020

The question already gave the answer for the storage location.

dumpsm42
Dec 8, 2020

Hi to all, no. krisspark is correct. You must pay attention to the retention period => 60 days, and Azure Log Analytics does not keep data for 60 days, so you must use a storage account. Regards.

calvintcy
Dec 19, 2020

Should be Azure Log Analytics

KRV
Jan 7, 2021

Since pipeline executions have to be analyzed beyond 45 days (60 in this case), "Azure Log Analytics" would be the incorrect answer choice, and the right answer should be "Azure Storage Account", which is already correctly selected here. The catch is to read the question carefully. Using Azure Monitor you can route the logs to multiple different targets: https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor

AyeshJr
Jan 22, 2021

That is incorrect; Log Analytics can retain data longer for a cost.

hoangton
May 18, 2021

Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs for analysis to multiple different targets.

Pairon
Apr 1, 2021

I don't know if it's possible to choose "Azure Log Analytics" as a sink in Data Factory. If my thought is correct, the correct answer is Storage account.