Exam DP-203
Question 54

HOTSPOT -

You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and an Azure Data Lake Storage Gen2 account named Account1.

You plan to access the files in Account1 by using an external table.

You need to create a data source in Pool1 that you can reference when you create the external table.

How should you complete the Transact-SQL statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

[Hot area image not reproduced. Per the discussion below, the statement to complete is: CREATE EXTERNAL DATA SOURCE source1 WITH ( LOCATION = 'https://account1.[Box 1].core.windows.net', TYPE = [Box 2] )]

    Correct Answer:

    Box 1: blob -

    The following example creates an external data source for Azure Data Lake Storage Gen2:

    CREATE EXTERNAL DATA SOURCE YellowTaxi
    WITH (
        LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
        TYPE = HADOOP
    )

    Box 2: HADOOP -

    Reference:

    https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables
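
    Assuming, per the discussion below, that the hot area's statement defines source1 over Account1, the completed statement under the official answer would read as follows (note that many commenters argue Box 1 should instead be dfs for ADLS Gen2):

    CREATE EXTERNAL DATA SOURCE source1
    WITH (
        LOCATION = 'https://account1.blob.core.windows.net',
        TYPE = HADOOP
    )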

Discussion
galacaw

1. dfs (for Azure Data Lake Storage Gen2)

panda_azzurro

dfs is not valid

suvec

dfs is valid for Data Lake Storage Gen2:
abfs[s]: <container>@<storage_account>.dfs.core.windows.net
http[s]: <storage_account>.dfs.core.windows.net/<container>/subfolders
wasb[s]: <container>@<storage_account>.blob.core.windows.net

Vedjha

CREATE EXTERNAL DATA SOURCE mydatasource
WITH (
    LOCATION = 'abfss://data@storageaccount.dfs.core.windows.net',
    CREDENTIAL = AzureStorageCredential,
    TYPE = HADOOP
)

jds0

This table corroborates that "dfs" should be used for ADLS Gen 2: https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop#location

Rob77

Correct, https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true&tabs=dedicated#location--prefixpath

Kure87

1. blob. According to this article https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop we only use dfs (the abfss endpoint) when your account has secure transfer enabled. In the question the location starts with "https://account1.", not "abfss://".

Sebastian1677

please upvote this

vadiminski_a

That's not correct: you use abfss:// if you have secure transfer enabled, but there is nothing wrong with using https:// when you don't. However, for ADLS Gen2 you still need to specify .dfs. in the endpoint. The correct answer is: dfs, HADOOP.

suvec

@kure87 dfs is valid for Data Lake Storage Gen2:
abfs[s]: <container>@<storage_account>.dfs.core.windows.net
http[s]: <storage_account>.dfs.core.windows.net/<container>/subfolders
wasb[s]: <container>@<storage_account>.blob.core.windows.net

tlb_20

As written in this Microsoft example on how to create an external data source over an Azure storage account: "TYPE = HADOOP, -- For dedicated SQL pool" and "TYPE = BLOB_STORAGE, -- For serverless SQL pool" https://learn.microsoft.com/en-us/training/modules/use-azure-synapse-serverless-sql-pools-for-transforming-data-lake/2-transform-data-using-create-external-table-select-statement
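
As a rough illustration of the distinction tlb_20 quotes, the two variants would look something like this sketch; the data source names are placeholders, not from the question:

-- Dedicated SQL pool: PolyBase-style external data source
CREATE EXTERNAL DATA SOURCE DedicatedSource
WITH (
    LOCATION = 'https://account1.blob.core.windows.net',
    TYPE = HADOOP
);

-- Serverless SQL pool, per the quoted training module
CREATE EXTERNAL DATA SOURCE ServerlessSource
WITH (
    LOCATION = 'https://account1.blob.core.windows.net',
    TYPE = BLOB_STORAGE
);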

Qordata

Answer is CORRECT: https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop#example-for-create-external-data-source
CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
)

auwia

Confirmed HADOOP and DFS:

External Data Source   | Connector | Location path
Data Lake Storage Gen1 | adl       | <storage_account>.azuredatalake.net
Data Lake Storage Gen2 | abfs[s]   | <container>@<storage_account>.dfs.core.windows.net
Azure Blob Storage     | wasbs     | <container>@<storage_account>.blob.core.windows.net
Azure Blob Storage     | https     | <storage_account>.blob.core.windows.net/<container>/subfolders
Data Lake Storage Gen1 | http[s]   | <storage_account>.azuredatalakestore.net/webhdfs/v1
Data Lake Storage Gen2 | http[s]   | <storage_account>.dfs.core.windows.net/<container>/subfolders
Data Lake Storage Gen2 | wasb[s]   | <container>@<storage_account>.blob.core.windows.net

Reloadedvn

1. blob 2. TYPE=HADOOP
Source: https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop
The following example creates an external data source for Azure Data Lake Gen2 pointing to the publicly available New York data set:
CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
)

ankeshpatel2112

Correct Answer: DFS
Explanation: If your source is ADLS Gen2 then it would be "DFS", and if your source is Azure Blob Storage then "Blob". Please refer to the table below from the Microsoft documentation.

External Data Source   | Connector (location prefix) | Location path
Data Lake Storage Gen1 | adl      | <storage_account>.azuredatalake.net
Data Lake Storage Gen2 | abfs[s]  | <container>@<storage_account>.dfs.core.windows.net
Azure Blob Storage     | wasbs    | <container>@<storage_account>.blob.core.windows.net
Azure Blob Storage     | https    | <storage_account>.blob.core.windows.net/<container>/subfolders
Data Lake Storage Gen1 | http[s]  | <storage_account>.azuredatalakestore.net/webhdfs/v1
Data Lake Storage Gen2 | http[s]  | <storage_account>.dfs.core.windows.net/<container>/subfolders
Data Lake Storage Gen2 | wasb[s]  | <container>@<storage_account>.blob.core.windows.net

ypan

Dedicated SQL Pool: Use CREATE EXTERNAL DATA SOURCE with TYPE = HADOOP for accessing Azure Data Lake Storage Gen2. Serverless SQL Pool: Use OPENROWSET for direct querying of the external data.
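
A minimal sketch of the serverless OPENROWSET pattern ypan describes; the container, folder, and file format are illustrative assumptions, not from the question:

SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://account1.dfs.core.windows.net/container1/folder1/*.parquet',
    FORMAT = 'PARQUET'
) AS [result];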

Charley92

CREATE EXTERNAL DATA SOURCE MyDataSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/',
    CREDENTIAL = <your-credential-name>
);

dgerok

The answer is correct; see the MS example: https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=hadoop#example-for-create-external-data-source

Alongi

Should be DFS for Data Lake Storage Gen2.

Momoanwar

Both `blob` and `dfs` endpoints work when connecting to Azure Data Lake Storage Gen2, but they serve different purposes. The `blob` endpoint is typically used for standard storage operations, while the `dfs` endpoint is optimized for hierarchical file system operations and is preferred for analytics workloads with Azure Synapse Analytics.

Momoanwar

To simply access files in Azure Data Lake Storage Gen2 for reading and analysis, without the need for Data Lake specific features like directory management or fine-grained ACLs, using the `blob` endpoint is sufficient. If your operations are primarily related to accessing files for reading, the `blob` endpoint can be used in the external data source definition within Azure Synapse Analytics.
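
To make that concrete, a hedged sketch of an external table that only reads files through the data source from this question; the file format, table, and column names are illustrative assumptions:

-- Illustrative file format for Parquet files
CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH ( FORMAT_TYPE = PARQUET );

-- External table that reads through source1 (from the question)
CREATE EXTERNAL TABLE dbo.TripData (
    TripId INT,
    FareAmount MONEY
)
WITH (
    LOCATION = '/yellow/',          -- folder path, illustrative
    DATA_SOURCE = source1,
    FILE_FORMAT = ParquetFormat
);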

fahfouhi94

Answer: dfs & HADOOP https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true&tabs=dedicated

kkk5566

CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    -- Please note the abfss endpoint when your account has secure transfer enabled
    LOCATION = 'abfss://data@newyorktaxidataset.dfs.core.windows.net',
    CREDENTIAL = ADLS_credential,
    TYPE = HADOOP
);

CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
);

HADOOP, blob

kdp203

dfs should be the correct answer (ADLS Gen2)

auwia

CREATE EXTERNAL DATA SOURCE source1 WITH ( LOCATION = 'https://account1.dfs.core.windows.net', TYPE = HADOOP )

aga444

CREATE EXTERNAL DATA SOURCE DataSourceName
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container>@account1.dfs.core.windows.net/',
    CREDENTIAL = SqlPoolCredential
);

janaki

CREATE EXTERNAL DATA SOURCE <datasource_name>
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container>@<account_name>.dfs.core.windows.net',
    CREDENTIAL = <credential_name>
);
So the answer is dfs and TYPE = HADOOP.
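
For completeness, a hedged end-to-end sketch of the dedicated-pool flow this thread converges on (dfs endpoint, TYPE = HADOOP). Every name other than Pool1/Account1 is an illustrative assumption, and the credential uses the documented storage-account-key pattern:

-- Run once per database before creating credentials
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- Database-scoped credential holding the storage account key
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = 'user',
     SECRET = '<storage account key>';

-- External data source over Account1's dfs endpoint
CREATE EXTERNAL DATA SOURCE source1
WITH (
    LOCATION = 'abfss://container1@account1.dfs.core.windows.net',
    CREDENTIAL = ADLSCredential,
    TYPE = HADOOP
);

-- File format and external table, both illustrative
CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH ( FORMAT_TYPE = PARQUET );

CREATE EXTERNAL TABLE dbo.ExternalData (
    Col1 INT,
    Col2 NVARCHAR(100)
)
WITH (
    LOCATION = '/folder1/',
    DATA_SOURCE = source1,
    FILE_FORMAT = ParquetFormat
);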