DP-200 Exam Questions

DP-200 Exam - Question 197


You have an Azure Data Lake Storage Gen2 account that contains a number of CSV files. Each file has a header row, and the rows that follow it are terminated by a carriage return (\r) and line feed (\n).

You need to load the files daily as a batch into Azure SQL Data Warehouse by using PolyBase, skipping the header row when the files are imported.

Which of the following actions would you take to implement this requirement? (Choose three.)

Correct Answer: ACD


The Microsoft documentation outlines the steps required to load data from Azure Data Lake Storage Gen2 into Azure SQL Data Warehouse.

One of the steps is to create a database scoped credential:

[Screenshot: CREATE DATABASE SCOPED CREDENTIAL statement]
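The screenshot is not reproduced here, so the following is a minimal sketch of the credential step. It assumes authentication with a storage account key; the credential name `ADLSCredential` and the placeholder values are illustrative, not taken from the exam material.

```sql
-- A master key must exist in the database before a scoped credential can be created.
CREATE MASTER KEY;

-- Sketch of a database scoped credential for ADLS Gen2 (placeholder name and secret).
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH
    IDENTITY = 'user',
    SECRET = '<azure_storage_account_key>';
```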

Another step is to create the external data source, using the 'abfs' scheme in the file location:

Create the external data source -

Use this CREATE EXTERNAL DATA SOURCE command to store the location of the data.

[Screenshot: CREATE EXTERNAL DATA SOURCE statement]
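Again, the screenshot is not reproduced, so here is a minimal sketch of the external data source step. The source name `AzureDataLakeStorage`, the container and account names, and the reference to the credential defined earlier are all placeholders; the secure 'abfss://' form of the ABFS scheme is assumed.

```sql
-- Sketch of an external data source pointing at an ADLS Gen2 container (placeholder names).
CREATE EXTERNAL DATA SOURCE AzureDataLakeStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://<container>@<storage-account>.dfs.core.windows.net',
    CREDENTIAL = ADLSCredential   -- the database scoped credential created earlier
);
```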

You can then use the FIRST_ROW parameter of CREATE EXTERNAL FILE FORMAT to skip the first row of each file.

FIRST_ROW = First_row_int -

Specifies the row number that is read first in all files during a PolyBase load. This parameter can take values 1-15. If the value is set to two, the first row in every file (the header row) is skipped when the data is loaded. Rows are skipped based on the existence of row terminators (\r\n, \r, \n). When this option is used for export, rows are added to the data to make sure the file can be read with no data loss. If the value is set to >2, the first row exported is the column names of the external table.
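Putting the FIRST_ROW behavior described above into context, a minimal sketch of the file format step follows. The format name `CsvSkipHeader` and the comma field terminator are assumptions for a typical CSV layout, not details given in the exam material.

```sql
-- Sketch of a delimited-text file format that skips the header row (placeholder name).
CREATE EXTERNAL FILE FORMAT CsvSkipHeader
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = ',',
        FIRST_ROW = 2   -- start reading at row 2, so the header row is skipped
    )
);
```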

Reference:

https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-file-format-transact-sql?view=sql-server-ver15

Discussion

1 comment
elimey
Jul 27, 2021

Correct answer is ACE.