Exam DP-203
Question 367

HOTSPOT -

You need to design a data ingestion and storage solution for the Twitter feeds. The solution must meet the customer sentiment analytics requirements.

What should you include in the solution? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

    Correct Answer:

    Box 1: Configure Event Hubs partitions

    Scenario: Maximize the throughput of ingesting Twitter feeds from Event Hubs to Azure Storage without purchasing additional throughput or capacity units.

    Event Hubs is designed to help with processing of large volumes of events. Event Hubs throughput is scaled by using partitions and throughput-unit allocations.
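    To make the partition idea concrete, here is a minimal Python sketch (using the azure-eventhub SDK) that spreads tweet events across the hub's partitions so sends can proceed in parallel; the connection string, hub name, and payloads are placeholder values, not part of the case study.

```python
# Minimal sketch: scale ingestion by fanning events out across Event Hub
# partitions rather than by buying more throughput units.
from azure.eventhub import EventHubProducerClient, EventData

CONN_STR = "<event-hubs-namespace-connection-string>"  # placeholder
EVENTHUB_NAME = "twitter-feed"                         # placeholder

tweets = ['{"id": 1, "text": "..."}', '{"id": 2, "text": "..."}']  # placeholder payloads

producer = EventHubProducerClient.from_connection_string(
    CONN_STR, eventhub_name=EVENTHUB_NAME
)
with producer:
    partition_ids = producer.get_partition_ids()  # e.g. ["0", "1", "2", "3"]
    for i, tweet in enumerate(tweets):
        # Pin each event to a partition (round-robin here); omitting partition_id
        # lets the service balance events across partitions automatically.
        batch = producer.create_batch(partition_id=partition_ids[i % len(partition_ids)])
        batch.add(EventData(tweet))
        producer.send_batch(batch)
```

    In practice you would pack many events into each batch; the point is only that the partition count, not the TU count, is the lever the scenario allows you to tune.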

    Incorrect Answers:

    ✑ Event Hubs Dedicated: Event Hubs clusters offer single-tenant deployments for customers with the most demanding streaming needs. This single-tenant offering has a guaranteed 99.99% SLA and is available only on the Dedicated pricing tier. Because a Dedicated cluster is purchased in capacity units, it conflicts with the requirement to avoid buying additional throughput or capacity units.

    ✑ Auto-Inflate: The Auto-inflate feature of Event Hubs automatically scales up by increasing the number of throughput units (TUs) to meet usage needs.

    Event Hubs traffic in the Standard tier is governed by TUs. Auto-inflate lets you start small with the minimum number of TUs you choose and then scales automatically, up to a maximum you set, as your traffic increases. Because each added TU is additional purchased throughput, this option also conflicts with the requirement not to buy extra throughput or capacity units.
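    For context only (Auto-inflate is the rejected option here), enabling it is a namespace-level setting along these lines; the resource group, namespace, region, and subscription ID below are hypothetical, and the sketch assumes a recent azure-mgmt-eventhub version.

```python
# Sketch: enable Auto-inflate on a Standard namespace. Note that every TU it
# adds is extra purchased throughput, which the scenario rules out.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient
from azure.mgmt.eventhub.models import EHNamespace, Sku

client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.namespaces.begin_create_or_update(
    "rg-sentiment",   # hypothetical resource group
    "ehns-twitter",   # hypothetical namespace
    EHNamespace(
        location="eastus",
        sku=Sku(name="Standard", tier="Standard", capacity=1),  # start at 1 TU
        is_auto_inflate_enabled=True,
        maximum_throughput_units=10,  # upper bound Auto-inflate can scale to
    ),
).result()
```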

    Box 2: An Azure Data Lake Storage Gen2 account

    Scenario: Ensure that the data store supports Azure AD-based access control down to the object level.

    Azure Data Lake Storage Gen2 implements an access control model that supports both Azure role-based access control (Azure RBAC) and POSIX-like access control lists (ACLs).
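    As a minimal sketch of what object-level control looks like in practice (the account URL, filesystem, file path, and Azure AD object ID are assumed placeholders), ADLS Gen2 lets you attach a POSIX-style ACL to a single file via the azure-storage-file-datalake SDK:

```python
# Sketch: grant one Azure AD principal read access to a single file (object level)
# using an ADLS Gen2 ACL. RBAC at account/container scope can coexist with this.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("twitter-raw")          # placeholder filesystem
file_client = fs.get_file_client("2024/02/01/tweets.json")  # placeholder path

# POSIX-style ACL: owner, a named Azure AD user, owning group, mask, other.
file_client.set_access_control(
    acl="user::rw-,user:<aad-object-id>:r--,group::r--,mask::r--,other::---"
)
```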

    Incorrect Answers:

    ✑ Azure Databricks: Azure Databricks is a processing and analytics platform rather than a data store, so it cannot satisfy the storage requirement on its own. An Azure administrator with the proper permissions can configure Azure Active Directory conditional access to control where and when users are permitted to sign in to Azure Databricks, but that does not provide object-level access control over stored data.

    ✑ Azure Storage: Azure Storage supports using Azure Active Directory (Azure AD) to authorize requests to blob data, but role assignments cannot be scoped to an individual blob, so a plain Blob Storage account does not meet the object-level access-control requirement.

    You can scope access to Azure blob resources at the following levels, beginning with the narrowest scope (see the sketch after this list):

    - An individual container. At this scope, a role assignment applies to all of the blobs in the container, as well as container properties and metadata.

    - The storage account. At this scope, a role assignment applies to all containers and their blobs.

    - The resource group. At this scope, a role assignment applies to all of the containers in all of the storage accounts in the resource group.

    - The subscription. At this scope, a role assignment applies to all of the containers in all of the storage accounts in all of the resource groups in the subscription.

    - A management group. At this scope, a role assignment applies to all of the storage accounts in all of the subscriptions under the management group.
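    A minimal sketch of Azure AD-based access to blob data (the account URL and container name are placeholders): the request below is authorized by whatever data-plane role the signed-in identity holds, and the narrowest scope at which that role can be assigned is the container, not an individual blob.

```python
# Sketch: Azure AD (DefaultAzureCredential) authorizing blob data access.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

container = service.get_container_client("twitter-raw")  # placeholder container
for blob in container.list_blobs():
    # Succeeds only if the caller holds a data-plane role such as
    # Storage Blob Data Reader at container scope or broader.
    print(blob.name)
```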

    Reference:

    https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-features
    https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control

Discussion
noobprogrammer

Answer looks correct to me: 1) Configure Event Hubs partitions - the description says: "Maximize the throughput of ingesting Twitter feeds from Event Hubs to Azure Storage without purchasing additional throughput or capacity units." 2) An Azure Data Lake Storage Gen2 account. A Databricks cluster has nothing to do with storage, and a Data Lake fits the needs.

Deeksha1234

correct!

Algasibiur

For me the correct answer is: 1. Azure Event Hubs Dedicated: In most streaming scenarios, data is lightweight, typically less than 1 MB, and requires high throughput. ref: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-dedicated-overview#supports-streaming-large-messages 2. An Azure Data Lake Storage Gen2 account

kkk5566

correct

Lotusss

Box one is Enable auto-inflate; box two is a Data Lake account.

NamitSehgal

Enable auto-inflate is for scaling up Azure Event Hubs throughput units, so the given answer by ExamTopics is correct.