Certified Data Engineer Associate Exam Questions

Certified Data Engineer Associate Exam - Question 74


Which of the following must be specified when creating a new Delta Live Tables pipeline?

Correct Answer: E

When creating a new Delta Live Tables pipeline, you must specify at least one notebook library to be executed. This is essential because the notebook contains the code and logic that defines the data processing steps and transformations. The storage location and target schema are optional: the system uses default locations if none are specified.
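For context, a pipeline's settings can also be expressed as JSON (for example, via the pipelines API or the pipeline's JSON editor in the UI). A minimal sketch, using a hypothetical notebook path, might look like the following; the `libraries` entry pointing at a notebook is the required part, while `storage` and `target` can be omitted:

```json
{
  "name": "example-dlt-pipeline",
  "libraries": [
    {
      "notebook": {
        "path": "/Users/someone@example.com/dlt_notebook"
      }
    }
  ],
  "continuous": false
}
```

If `storage` is left out, the system writes output data to a default location, which matches the behavior described in the answer above.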

Discussion

13 comments
Stemix (Option: E)
Jan 26, 2024

Correct answer is E. storage location is optional. "(Optional) Enter a Storage location for output data from the pipeline. The system uses a default location if you leave Storage location empty"

kishanu (Option: C)
Oct 20, 2023

A path to a cloud storage location for the written data: I read this option as referring to the source data being stored in cloud storage and ingested into DLT using Auto Loader.

meow_akk (Option: E)
Oct 22, 2023

Ans E: I think it might be E. https://docs.databricks.com/en/delta-live-tables/settings.html says that the target schema and storage location are optional, which leaves us with E.

Syd
Oct 29, 2023

Answer is E. The storage location and target are optional. https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html

Huroye (Option: E)
Nov 15, 2023

The correct answer is E. A DLT pipeline needs a notebook in which you specify the processing logic.

kishore1980 (Option: C)
Oct 28, 2023

A storage location is required in order to control the object storage location for data written by the pipeline.

saaaaaa (Option: E)
Dec 19, 2023

This should be E. As per https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html, the steps to create a pipeline are:
1. Click Workflows in the sidebar, click the Delta Live Tables tab, and click Create Pipeline.
2. Give the pipeline a name and click the file picker icon to select a notebook.
3. Select Triggered for Pipeline Mode.
4. (Optional) Enter a Storage location for output data from the pipeline. The system uses a default location if you leave Storage location empty.
5. (Optional) Specify a Target schema to publish your dataset to the Hive metastore, or a Catalog and a Target schema to publish your dataset to Unity Catalog. See Publish datasets.
6. (Optional) Click Add notification to configure one or more email addresses to receive notifications for pipeline events. See Add email notifications for pipeline events.
7. Click Create.

Garyn (Option: E)
Dec 31, 2023

E. At least one notebook library to be executed. Explanation: https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html. Delta Live Tables pipelines execute notebook libraries as part of their operations. These notebooks contain the logic, code, and instructions defining the data processing steps and transformations to be performed within the pipeline. Specifying at least one notebook library is therefore required when creating a new Delta Live Tables pipeline, since it defines the sequence of operations and the logic to be executed on the data, in line with the documentation.

Azure_2023 (Option: E)
Jan 16, 2024

https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html — E. The only non-optional selection is a notebook.

55f31c8 (Option: C)
Nov 30, 2023

https://docs.databricks.com/en/delta-live-tables/index.html#what-is-a-delta-live-tables-pipeline

azure_bimonster (Option: E)
Jan 20, 2024

As per the pipeline creation steps, choosing a notebook is mandatory whereas specifying a storage location is optional. I would go with answer E.

BigMF (Option: C)
Mar 18, 2024

Per Databricks documentation (see below), you need to select a destination for datasets published by the pipeline, either the Hive metastore or Unity Catalog. I think A is incorrect because it uses the term "notebook library" and not just "notebook". Databricks doc: https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html

benni_ale (Option: E)
Apr 29, 2024

To be fair, C is correct as well, but the question is probably hinting at E.

Shinigami76 (Option: C)
Jun 11, 2024

C. Just tested on Databricks DLT.