Exam DP-600
Question 13

You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:

Table1: A Delta table created by using a shortcut

Table2: An external table created by using Spark

Table3: A managed table

You plan to connect to Lakehouse1 by using its SQL endpoint.

What will you be able to do after connecting to Lakehouse1?

A. Read Table3.
B. Update the data in Table3.
C. Read Table2.
D. Update the data in Table1.

    Correct Answer: A

    When you connect to the SQL analytics endpoint of a Fabric lakehouse, you can read data from its managed tables. Table3 in Lakehouse1 is a managed table, so it can be read through the SQL endpoint. The endpoint itself is read-only, so none of the tables can be updated.
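    A minimal sketch of reading through the endpoint from Python, assuming pyodbc and the ODBC Driver 18 for SQL Server are installed; the server name below is a hypothetical placeholder for the endpoint's SQL connection string:

        # Read a managed table through the SQL analytics endpoint (read-only).
        import pyodbc

        conn = pyodbc.connect(
            "Driver={ODBC Driver 18 for SQL Server};"
            "Server=xxxxxxxx.datawarehouse.fabric.microsoft.com;"  # hypothetical placeholder
            "Database=Lakehouse1;"
            "Authentication=ActiveDirectoryInteractive;"
        )
        cursor = conn.cursor()

        # Reads against managed tables such as Table3 succeed.
        for row in cursor.execute("SELECT TOP 10 * FROM Table3"):
            print(row)

        # An UPDATE would fail, because the endpoint is read-only:
        # cursor.execute("UPDATE Table3 SET some_col = 1")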

Discussion
conieOption: A

B & D is out as you can’t update a table in lakehouse using SQL endpoint as this is read only. You will need to use spark or dataflows. C is out because when you create external table using spark, you can see the table from the lakehouse but you can’t see the table from SQL endpoint let alone ready. A is the answer, as I was able to see and read a managed table using SQL Endpoint

i_have_a_name

It seems there is a way to have an external table created using Spark show up in the lakehouse SQL endpoint. This is the Spark code I used for testing:

df.write.format("delta").option("path","Tables/externalfolder").saveAsTable("testmanagedcsveditors")

When I run the catalog query spark.catalog.listTables(), it returns:

Table(name='unmanagedcsveditors', catalog='spark_catalog', namespace=['training_lakehouse'], description=None, tableType='EXTERNAL', isTemporary=False)

Now if I go to the SQL endpoint, I can see a table named "externalfolder". I agree that the table name should be "unmanagedcsveditors"; in fact I can see two tables in the Lakehouse explorer: externalfolder and unmanagedcsveditors.
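To reproduce the comparison above, a minimal PySpark sketch contrasting managed and external table creation, assuming a notebook attached to a lakehouse (table and path names are hypothetical):

    # spark is the session provided by the Fabric notebook runtime.
    df = spark.range(5)  # placeholder DataFrame

    # Managed table: Spark manages both data and metadata; it shows up in the SQL endpoint.
    df.write.format("delta").saveAsTable("managed_example")

    # External table: data is written to an explicit path; listTables() reports
    # tableType='EXTERNAL', and per this thread it is not surfaced by the SQL endpoint.
    df.write.format("delta").option("path", "Tables/external_example").saveAsTable("external_example")

    # Inspect what Spark registered.
    for t in spark.catalog.listTables():
        print(t.name, t.tableType)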

stilferxOption: A

IMHO, the answer is A. Updates are prohibited entirely, and reading a Spark external table is a big no-no. Link: https://www.linkedin.com/pulse/use-shortcuts-instead-external-tables-reference-data-fabric-popovic/

a_51Option: A

A is correct, D is not. SQL endpoint is read-only. https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-sql-analytics-endpoint

a_51

That link also tells you what to do if you need to modify data: "To modify data in Lakehouse delta tables, you have to switch to lakehouse mode and use Apache Spark."

Aruuu6Option: A

A is the answer.

XiltroXOption: A

With the SQL endpoint in a lakehouse, you can only read tables, not update them, and even that only applies to managed tables. External tables have to be read by opening the lakehouse and using Spark DataFrame commands.

BrandonPerksOption: A

I can confirm it is A. I manually went to my Fabric tenant to investigate.

rmengOption: A

C is not correct because external tables are not accessible from the endpoint. B and D are out of the question because the SQL endpoint is read-only.

Chrys941Option: A

D is completely wrong; it is not typically feasible through a shortcut in a SQL endpoint setup. B is generally supported but depends on the SQL endpoint permissions for such operations. A is right: you should be able to read data from Table3 since it is a managed table, and such operations are standard through SQL endpoints.

GPerez73Option: A

A is correct. Tested!

mtroyanoOption: A

Options B and D are incorrect because the endpoint does not support DML operations. Option C is not correct because external tables are not accessible from the endpoint. The correct answer is A: managed tables can be read from the SQL endpoint.

TashaPOption: A

A. Read Table 3

AhmadpbiOption: D

D is the correct answer. In a lakehouse, the tables that can be edited using SQL endpoints are primarily Delta tables. These tables are specifically designed to be compatible with SQL analytics endpoints, allowing you to perform various SQL operations such as querying, creating views, and applying SQL security. Other table formats like Parquet, CSV, and JSON are not directly editable through SQL endpoints; they need to be converted to Delta format first.

b65eccaOption: A

External Delta tables created with Spark code won't be visible to the SQL analytics endpoint. https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-sql-analytics-endpoint So C is out. B and D are also out since with a SQL endpoint we can't update tables.

Parth_MehtaOption: A

We can read Table1 and Table3 but not Table2, and we can't write to any of them, so the answer will be A.

282b85dOption: A

Managed tables are fully controlled by the database system. These tables typically allow both read and write operations, including updates, through SQL endpoints. After connecting to Lakehouse1 using its SQL endpoint, you will be able to read Table3 (A) and update the data in Table3 (B). So, the correct options are A (read Table3) and B (update the data in Table3).

MomoanwarOption: A

The endpoint doesn't allow modifications, and it can't read external tables.

wojciech_wieOption: A

Answer A is correct. Using the SQL endpoint we can only READ tables, so B and D are out. Moreover, for now we can't read external tables using the SQL endpoint (https://community.fabric.microsoft.com/t5/General-Discussion/Fabric-SQL-end-point-not-showing-external-delta-tables/m-p/3475969).