DP-600 Exam Questions

DP-600 Exam - Question 13


You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:

Table1: A Delta table created by using a shortcut

Table2: An external table created by using Spark

Table3: A managed table

You plan to connect to Lakehouse1 by using its SQL endpoint.

What will you be able to do after connecting to Lakehouse1?

A. Read Table3.
B. Update the data in Table3.
C. Read Table2.
D. Update the data in Table1.

Correct Answer: A

Upon connecting to the SQL analytics endpoint of a Fabric lakehouse, you can read data from managed tables. Table3 in Lakehouse1 is a managed table, so it can be read through the SQL endpoint.
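The read-only behavior described above can be sketched as follows. This is a minimal illustration, not Fabric-verified: it assumes a DB-API driver such as pyodbc is used to obtain the connection, and the three-part name Lakehouse1.dbo.Table3 is a hypothetical example.

```python
# Sketch: reading the managed table Table3 over the Lakehouse1 SQL
# analytics endpoint. The table name below is an illustrative assumption.

READ_QUERY = "SELECT TOP 10 * FROM Lakehouse1.dbo.Table3;"

def read_table3(connection):
    """Run a read-only SELECT against Table3 via a live DB-API connection."""
    cursor = connection.cursor()
    cursor.execute(READ_QUERY)
    return cursor.fetchall()

# A statement like the one below would be rejected: the SQL analytics
# endpoint is read-only for lakehouse Delta tables, so any DML has to
# go through Spark (or dataflows) instead.
BLOCKED_DML = "UPDATE Lakehouse1.dbo.Table3 SET SomeColumn = 1;"
```

The point of the sketch is the split of responsibilities: SELECTs belong to the endpoint, while the UPDATE shown as `BLOCKED_DML` belongs to the Spark side.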

Discussion

17 comments
conie (Option: A)
Feb 10, 2024

B and D are out, as you can't update a table in a lakehouse using the SQL endpoint, which is read-only; you would need to use Spark or dataflows instead. C is out because when you create an external table using Spark, you can see the table from the lakehouse but you can't see it from the SQL endpoint, let alone read it. A is the answer: I was able to see and read a managed table using the SQL endpoint.

i_have_a_name
May 30, 2024

It seems there is a way to have an external table created using Spark show up in the Lakehouse SQL endpoint. This is the Spark code I used for testing: df.write.format("delta").option("path","Tables/externalfolder").saveAsTable("testmanagedcsveditors"). When I run the catalog query spark.catalog.listTables(), it returns: Table(name='unmanagedcsveditors', catalog='spark_catalog', namespace=['training_lakehouse'], description=None, tableType='EXTERNAL', isTemporary=False). Now if I go to the SQL endpoint, I can see a table named "externalfolder". I agree that the table name should be "unmanagedcsveditors"; in fact I can see two tables in the Lakehouse explorer: externalfolder and unmanagedcsveditors.
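For context, the managed-vs-external distinction this thread keeps returning to comes down to whether saveAsTable is given an explicit path. A minimal sketch, assuming a Fabric notebook where a DataFrame df already exists; the table names and path are hypothetical examples:

```python
# Sketch: managed vs. external Delta tables in a Fabric lakehouse.
# Names and path below are made up for illustration.

def create_managed(df, name):
    # No explicit path: Spark manages the storage location, and the
    # table is exposed through the SQL analytics endpoint.
    df.write.format("delta").saveAsTable(name)

def create_external(df, name, path):
    # Explicit path: the catalog entry has tableType='EXTERNAL', and per
    # this thread such tables are not exposed by the SQL endpoint.
    df.write.format("delta").option("path", path).saveAsTable(name)

# In a Fabric notebook you would call, e.g.:
# create_managed(df, "managed_demo")
# create_external(df, "external_demo", "Tables/external_demo_folder")
```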

a_51 (Option: A)
Mar 19, 2024

A is correct, D is not. SQL endpoint is read-only. https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-sql-analytics-endpoint

a_51
Mar 19, 2024

That link also tells you what to do if you need to modify data: "To modify data in Lakehouse delta tables, you have to switch to lakehouse mode and use Apache Spark."
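The Spark-side write that the quoted docs point to can be sketched as below. This assumes a Fabric notebook where a SparkSession named spark is already provided; the row values in the example call are hypothetical.

```python
# Sketch: modifying a lakehouse Delta table has to happen on the Spark
# side, because the SQL analytics endpoint is read-only.

def append_rows(spark, rows, table="Table3"):
    """Append in-memory rows to a managed Delta table via Spark."""
    df = spark.createDataFrame(rows)
    df.write.format("delta").mode("append").saveAsTable(table)

# Example call inside a Fabric notebook (schema is made up):
# append_rows(spark, [(1, "a"), (2, "b")], table="Table3")
```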

stilferx (Option: A)
May 8, 2024

IMHO, the answer is A. Updates are prohibited entirely, and reading a Spark external table is a big no-no. Link: https://www.linkedin.com/pulse/use-shortcuts-instead-external-tables-reference-data-fabric-popovic/

BrandonPerks (Option: A)
Feb 18, 2024

I can confirm it is A. I manually went to my fabric tenant to investigate.

XiltroX (Option: A)
Feb 26, 2024

With the SQL endpoint in a lakehouse you can only read tables, not update them, and that only applies to managed tables. External tables have to be read from inside the lakehouse using Spark DataFrame commands.

Aruuu6 (Option: A)
Mar 6, 2024

A is the answer.

TashaP (Option: A)
Feb 25, 2024

A. Read Table 3

mtroyano (Option: A)
Mar 19, 2024

Options B and D are incorrect because the endpoint does not support DML operations. Option C is not correct because external tables are not accessible from the endpoint. The correct answer is A: managed tables can be read from the endpoint.

GPerez73 (Option: A)
Apr 10, 2024

A is correct. Tested!

Chrys941 (Option: A)
Apr 27, 2024

D is completely wrong: an update through a shortcut is not feasible in a SQL endpoint setup. B is generally supported elsewhere but depends on the SQL endpoint permitting such operations, which it does not here. A is right: you should be able to read data from Table3, since it is a managed table and such reads are standard through SQL endpoints.

rmeng (Option: A)
May 2, 2024

C is not correct because external tables are not accessible from the endpoint. B and D are out of the question because the SQL endpoint is read-only.

wojciech_wie (Option: A)
Feb 17, 2024

Answer A is correct. Using the SQL endpoint we can only READ tables, so B and D are out. Moreover, for now we can't read external tables using the SQL endpoint (https://community.fabric.microsoft.com/t5/General-Discussion/Fabric-SQL-end-point-not-showing-external-delta-tables/m-p/3475969).

Momoanwar (Option: A)
Feb 17, 2024

The endpoint can't modify tables and can't read external ones.

282b85d (Option: A)
May 27, 2024

Managed tables are fully controlled by the database system. These tables typically allow both read and write operations, including updates, through SQL endpoints. After connecting to Lakehouse1 using its SQL endpoint, you will be able to:

- Read Table3 (A)
- Update the data in Table3 (B)

So, the correct options are: A. Read Table3. B. Update the data in Table3.

Parth_Mehta (Option: A)
Jun 12, 2024

We can read Table1 and Table3 but not Table2, and we can't write to any of them, so the answer will be A.

b65ecca (Option: A)
Jul 5, 2024

External Delta tables created with Spark code won't be visible to the SQL analytics endpoint (https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-sql-analytics-endpoint), so C is out. B and D are also out, since with a SQL endpoint we can't update tables.

Ahmadpbi (Option: D)
Jul 14, 2024

D is the correct answer. In a lakehouse, the tables that can be edited using SQL endpoints are primarily Delta tables. These tables are specifically designed to be compatible with SQL analytics endpoints, allowing you to perform various SQL operations such as querying, creating views, and applying SQL security. Other table formats like Parquet, CSV, and JSON are not directly editable through SQL endpoints; they need to be converted to Delta format first.