DP-700 Exam - Question 63


HOTSPOT -

You have an Azure Event Hubs data source that contains weather data.

You ingest the data from the data source by using an eventstream named Eventstream1. Eventstream1 uses a lakehouse as the destination.

You need to batch ingest only rows from the data source where the City attribute has a value of Kansas. The filter must be added before the destination. The solution must minimize development effort.

What should you use for the data processor and filtering? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Image: Question 63 answer area]

Correct Answer:

[Image: Question 63 correct answer]

Discussion

4371883
Jan 25, 2025

1. Eventstream with an external data source
2. Eventstream processor

https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/add-source-azure-event-hubs?pivots=enhanced-capabilities
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/process-events-using-event-processor-editor?pivots=enhanced-capabilities
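For intuition, here is a minimal sketch of the logic an eventstream Filter operation applies, written against the Azure Event Hubs Python SDK. The connection string, hub name, and City field are illustrative assumptions; in Fabric this filter would be configured in the no-code event processor editor, not in code.

# Illustrative sketch only: the filter an eventstream processor would apply,
# expressed with the azure-eventhub SDK. Connection details are placeholders.
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hubs-connection-string>"  # assumption: namespace connection string
EVENTHUB_NAME = "weather"                    # assumption: hub carrying the weather data

def on_event(partition_context, event):
    record = event.body_as_json()            # weather event payload as a dict
    if record.get("City") == "Kansas":       # the City = 'Kansas' filter from the question
        print("would forward to the lakehouse destination:", record)

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = from the beginning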

18e18d0
Feb 8, 2025

The answer is correct. The question states that the eventstream already exists and uses the lakehouse as its destination. It also states that the rows need to be batch ingested. Thus 1) Dataflow Gen2 and 2) the Filter operation are the best fit in this situation.

henryphchan
Feb 17, 2025

The provided answer is correct. This question asks how to batch ingest only the rows from the data source where the City attribute has a value of Kansas. To minimize development effort, the data processor must be Dataflow Gen2, and the filtering should use the Filter operation in Dataflow Gen2.
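For comparison, a rough sketch of the equivalent batch filter in pandas; the file path and column name are assumptions for illustration. In Dataflow Gen2 the same predicate would be expressed with the no-code Filter option on the City column.

# Batch-ingest equivalent of a Dataflow Gen2 Filter step, sketched in pandas.
# The parquet paths are illustrative assumptions.
import pandas as pd

df = pd.read_parquet("weather_raw.parquet")       # staged weather data (assumed path)
kansas_only = df[df["City"] == "Kansas"]          # keep only rows where City == "Kansas"
kansas_only.to_parquet("weather_kansas.parquet")  # write the filtered batch output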

zxc01
Apr 1, 2025

The problem is "You need to batch ingest only rows from the data source where the City attribute has a value of Kansas." Where is the data source here? It is very hard to build a Dataflow Gen2 if the data source is Event Hubs. The key phrase "batch ingest" does point to Dataflow Gen2, and the question says the eventstream already saves the data to the lakehouse. If the data source means the table in the lakehouse, then I can agree Dataflow Gen2 is the best option.

hebertorosillo
Mar 22, 2025

1. Eventstream with an external data source 2. Eventstream processor. Dataflow does not support Event Hubs as a source.

13d2a97
Apr 19, 2025

Data processor: An eventstream with a custom endpoint. Eventstreams allow real-time processing of data using processors, and a custom endpoint provides flexibility in routing data downstream, especially after transformations or filters.

Filtering: An eventstream processor. Eventstream processors are used within the eventstream pipeline to apply transformations, filters (like the "City = 'Kansas'" logic here), or aggregations before writing to a destination. This is the most efficient, low-code way to apply this filter.
