Exam DP-203
Question 198

You have an Azure Data Factory pipeline named Pipeline1. Pipeline1 contains a copy activity that sends data to an Azure Data Lake Storage Gen2 account.

Pipeline1 is executed by a schedule trigger.

You change the copy activity sink to a new storage account and merge the changes into the collaboration branch.

After Pipeline1 executes, you discover that data is NOT copied to the new storage account.

You need to ensure that the data is copied to the new storage account.

What should you do?

    Correct Answer: A

    In Azure Data Factory, when the factory is connected to a Git repository, changes merged into the collaboration branch are not live until they are published. Publishing deploys the collaboration-branch definitions to the Data Factory service, which is what the schedule trigger runs against. Here, the sink change was merged but never published, so the scheduled run still used the old definition and no data reached the new storage account. Publishing from the collaboration branch applies the change so that the pipeline writes to the new storage account.
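
    Because the Data Factory management API reads the published (live) definitions rather than the Git branch, the gap is visible by fetching Pipeline1's live definition. A minimal Python sketch, assuming the azure-mgmt-datafactory package and placeholder names (rg1, adf1, "<subscription-id>"):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Placeholder names; substitute your own subscription, resource group and factory.
    adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # pipelines.get returns the LIVE (published) definition, not the Git branch.
    pipeline = adf.pipelines.get("rg1", "adf1", "Pipeline1")

    # If the merge was never published, the copy activity's sink dataset
    # reference here still points at the dataset for the old storage account.
    for activity in pipeline.activities:
        if activity.type == "Copy":
            print([ref.reference_name for ref in activity.outputs])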

Discussion
kkk5566 (Option: A)

correct

ludaka (Option: A)

Correct answer.

Xinyuehong (Option: A)

I have heard of "publish to" but never of "publish from". So confused.

Igor85

Probably it means publishing from the collaboration branch to the adf_publish branch.

evangelist (Option: A)

A is correct

learnwell (Option: A)

The "schedule trigger" always runs in live mode, hence it will take the storage account name which is available in the live mode. Now on the other hand, when the changes in storage account name is done and merged to collaboration branch(by raising a PULL REQUEST) , it is still not yet available in ADF live mode which we need to do manually by publishing from the collaboration branch.

learnwell

So basically the data did not get copied to the new storage account because, in live mode, the pipeline definition still held the old storage account name, and the data almost certainly got copied to the old storage account when the scheduled trigger ran. We need to publish "from" the collaboration branch "to" ADF live mode so that the updated storage account name becomes available in live mode and the scheduled trigger picks up the NEW storage account the next time it fires.
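
learnwell's point about live mode can be checked the same way, since the management API only sees published definitions. A hedged Python sketch, assuming the azure-mgmt-datafactory package and hypothetical names (rg1, adf1, SinkLinkedService):

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Read the PUBLISHED linked service behind the copy activity's sink.
# "SinkLinkedService" is a hypothetical name for illustration.
ls = adf.linked_services.get("rg1", "adf1", "SinkLinkedService")

# For an ADLS Gen2 linked service, the endpoint URL shows which storage
# account the live pipeline actually writes to. Before publishing, this
# would still print the old account's endpoint.
print(ls.properties.url)  # e.g. https://oldaccount.dfs.core.windows.net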

werfragt (Option: A)

I don't get it. The question says: "After Pipeline1 executes, you discover that data is NOT copied to the new storage account. You need to ensure that the data is copied to the new storage account." So this means the pipeline does not work. How on earth would it make sense to publish a pipeline that is not working?

mav2000

Because you merged the change into the collaboration branch but didn't publish it, the pipeline didn't copy the data to the new storage account. Think of publishing as deploying to live.

kim32 (Option: B)

I selected B, pull request

auwia

I guessed the same, but in ADF the publish step is manual, not automatic. When the question says the code was merged, you can assume that a pull request was already raised and approved by reviewers; the remaining step is publishing. Therefore, the correct answer for me is A.
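
Worth adding: the manual publish can be automated in CI/CD by exporting the collaboration branch to an ARM template (for example with the @microsoft/azure-data-factory-utilities npm package) and deploying that template, which is the scripted equivalent of pressing Publish. A hedged Python sketch of just the deployment step, assuming the template was already exported to arm_template.json and placeholder names (rg1, "<subscription-id>"):

import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment, DeploymentMode, DeploymentProperties,
)

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Load the ARM template previously exported from the collaboration branch.
with open("arm_template.json") as f:
    template = json.load(f)

# Deploying the template updates the LIVE factory, like the Publish button.
poller = client.deployments.begin_create_or_update(
    "rg1",                      # resource group of the target factory
    "adf-publish-deployment",   # arbitrary deployment name
    Deployment(
        properties=DeploymentProperties(
            mode=DeploymentMode.INCREMENTAL,
            template=template,
        )
    ),
)
poller.result()  # block until the live factory reflects the merged changes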

debarun (Option: B)

Why not B?

DataEX

Because the pull request is already implicit in the scenario, since the changes are said to be merged into the collaboration branch: "You change the copy activity sink to a new storage account and MERGE the CHANGES INTO the COLLABORATION BRANCH."