Certified Data Engineer Professional Exam Questions

Certified Data Engineer Professional Exam - Question 34


The data architect has mandated that all tables in the Lakehouse should be configured as external Delta Lake tables.

Which approach will ensure that this requirement is met?

Correct Answer: C

To ensure that all tables in the Lakehouse are configured as external Delta Lake tables, the LOCATION keyword should be used whenever a table is created. Specifying an external storage location for each table individually registers it as an external table, ensuring compliance with the data architect's mandate.
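A minimal sketch of what this looks like in Databricks SQL; the table name and storage path below are hypothetical placeholders, not part of the question:

```sql
-- Hypothetical table name and path, for illustration only.
-- Supplying LOCATION in CREATE TABLE registers the table as external:
-- dropping it removes only the metastore entry, not the underlying data files.
CREATE TABLE sales (
  order_id BIGINT,
  amount   DOUBLE
)
USING DELTA
LOCATION 'abfss://container@account.dfs.core.windows.net/lakehouse/sales';
```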

Discussion

7 comments
Quadronoid (Option: C)
Oct 30, 2023

C is correct. The LOCATION keyword should be in the table's CREATE script.

mouad_attaqi (Option: C)
Oct 24, 2023

C is correct; the keyword to use is LOCATION. The EXTERNAL keyword is optional.

chokthewa (Option: D)
Oct 21, 2023

The correct answer is D.

mht3336
Jan 25, 2024

There is no EXTERNAL keyword in Databricks; however, it does exist in other systems such as Oracle, Hive, and Cassandra.

Dusica
May 31, 2024

And Microsoft Synapse.

Yogi05 (Option: D)
Dec 26, 2023

Why not D? I know C and D are the same, but D is more precise.

Yogi05
Dec 27, 2023

My bad. D has the EXTERNAL keyword; I got confused. C is the correct answer.

CY (Option: A)
Feb 10, 2024

'A' seems more appropriate. All tables in the Lakehouse should be marked as external, which can be achieved using the LOCATION keyword at the database level instead of at each table level.

leopedroso1 (Option: C)
Feb 20, 2024

C is the correct answer. According to the documentation, only LOCATION is needed to make a table external; we can also assume the EXTERNAL keyword is optional in the SQL statement. https://docs.databricks.com/en/sql/language-manual/sql-ref-external-tables.html

Laraujo2022 (Option: A)
Nov 16, 2023

If you set a location at the database level, all tables under that database automatically become external tables, so in my opinion A is correct.

Isio05
Jun 8, 2024

According to what I've found in the Databricks forums: "Database location and table location are independent." So specifying a location at the database level is not sufficient, as tables will still be created as managed ones.
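The distinction discussed in this thread can be sketched as follows; all schema, table, and path names here are hypothetical. A LOCATION on the schema only sets the default storage path for its managed tables, while a LOCATION on the table itself is what makes the table external:

```sql
-- Hypothetical names and paths, for illustration only.
CREATE SCHEMA finance
LOCATION 'abfss://container@account.dfs.core.windows.net/finance';

-- Still a MANAGED table: it merely inherits the schema's default path.
CREATE TABLE finance.invoices (id BIGINT) USING DELTA;

-- An EXTERNAL table: its own LOCATION clause is what makes it external.
CREATE TABLE finance.payments (id BIGINT)
USING DELTA
LOCATION 'abfss://container@account.dfs.core.windows.net/finance/payments';

-- DESCRIBE EXTENDED finance.invoices;  -- reports Type: MANAGED
-- DESCRIBE EXTENDED finance.payments;  -- reports Type: EXTERNAL
```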