DP-201 Exam - Question 47


You are designing a streaming solution that must meet the following requirements:

✑ Accept input data from an Azure IoT hub.

✑ Write aggregated data to Azure Cosmos DB.

✑ Calculate minimum, maximum, and average sensor readings every five minutes.

✑ Define calculations by using a SQL query.

✑ Deploy to multiple environments by using Azure Resource Manager templates.

What should you include in the solution?

Correct Answer: C (Azure Databricks)

Cosmos DB is well suited to IoT solutions and can ingest device telemetry data at high rates.

Architecture -

[Architecture diagram: events flow from IoT devices through Azure IoT Hub to Azure Databricks (Spark Streaming) and into Azure Cosmos DB.]

Data flow -

1. Events generated from IoT devices are sent to the analyze and transform layer through Azure IoT Hub as a stream of messages. Azure IoT Hub stores streams of data in partitions for a configurable amount of time.

2. Azure Databricks, running Apache Spark Streaming, picks up the messages in real time from IoT Hub, processes the data based on the business logic, and sends it to the serving layer for storage. Spark Streaming can provide real-time analytics such as calculating moving averages and min and max values over time periods (see the sketch after this list).

3. Device messages are stored in Cosmos DB as JSON documents.
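
As a rough illustration of step 2, here is a minimal Spark SQL sketch of the five-minute aggregation. It assumes a streaming view named sensor_stream, with hypothetical deviceId, eventTime, and reading columns, has already been registered over the IoT Hub source in a Databricks notebook:

    -- Five-minute tumbling-window aggregation in Spark SQL.
    -- sensor_stream is a hypothetical streaming view over the IoT Hub
    -- source; deviceId, eventTime, and reading are assumed columns.
    SELECT
        window(eventTime, '5 minutes') AS timeWindow,
        deviceId,
        MIN(reading) AS minReading,
        MAX(reading) AS maxReading,
        AVG(reading) AS avgReading
    FROM sensor_stream
    GROUP BY window(eventTime, '5 minutes'), deviceId

The aggregated rows would then be written to Cosmos DB as JSON documents (step 3), for example through the Cosmos DB Spark connector.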

Reference:

https://docs.microsoft.com/en-us/azure/architecture/solution-ideas/articles/iot-using-cosmos-db

Discussion

8 comments
suman13
Apr 8, 2021

An ARM template can only be used to create a Databricks workspace, not the application (notebooks), clusters, etc., whereas a Stream Analytics job can be deployed entirely from a template. Hence Stream Analytics should be the answer here.

KRV
Apr 4, 2021

Since the question clearly specifies defining the calculations by using a SQL query, and the minimum, maximum, and average sensor readings are to be calculated every five minutes, window functions within Azure Stream Analytics would be a straightforward and preferred option (see the sketch below). One can always use Azure Databricks; however, for minimal code using SQL and window functions, the best possible solution ideally should be Azure Stream Analytics.
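
For illustration, here is a minimal sketch of the kind of Stream Analytics query described above; the input alias iotHubInput, the output alias cosmosOutput, and the column names are hypothetical placeholders:

    -- Min, max, and average per device over a five-minute tumbling window.
    -- Input/output aliases and column names are assumed, not from the exam.
    SELECT
        deviceId,
        MIN(reading) AS minReading,
        MAX(reading) AS maxReading,
        AVG(reading) AS avgReading,
        System.Timestamp() AS windowEnd
    INTO
        cosmosOutput
    FROM
        iotHubInput TIMESTAMP BY eventTime
    GROUP BY
        deviceId,
        TumblingWindow(minute, 5)

Because a Stream Analytics job, including its query, inputs, and outputs, can be declared as resources in an ARM template, a query like this would also satisfy the deployment requirement.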

cadio30
May 24, 2021

D. Azure Stream Analytics is the appropriate solution for the requirements

SK1984
Apr 3, 2021

Why not D. Azure Stream Analytics?

maynard13x8
Apr 6, 2021

I think it is because ASA job queries are not exactly SQL. If that is not the case...

maynard13x8
Apr 6, 2021

I would also choose ASA (Azure Stream Analytics).

maciejt
Apr 4, 2021

Azure Functions is also present in the architecture. Why is it an incorrect answer then?

davem0193
Jun 19, 2021

The architectural diagram provided as part of the solution clearly shows that it needs to be Databricks, although Stream Analytics makes more sense. The solution provided is correct: it is Databricks.

Anonymous
Jun 27, 2021

On the same page, alternatives are provided and one of them is Stream Analytics. ARM template deployment of jobs is possible there, whereas Databricks notebooks cannot be deployed through ARM templates.

Anonymous
Jun 27, 2021

In the link given, there is an alternatives section which states that Stream Analytics could be used as an alternative for the streaming layer. That is another point in favor of the argument that Stream Analytics should be the answer.

Ankush1994
Aug 15, 2021

Azure Databricks, running Apache Spark Streaming, picks up the messages in real time from IoT Hub, processes the data based on the business logic and sends the data to Serving layer for storage.