Exam AZ-305
Question 70

You have an Azure subscription that contains an Azure Blob Storage account named store1.

You have an on-premises file server named Server1 that runs Windows Server 2016. Server1 stores 500 GB of company files.

You need to store a copy of the company files from Server1 in store1.

Which two Azure services can you use to achieve this goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A. an Azure Logic Apps integration account
B. an Azure Import/Export job
C. Azure Data Factory
D. an Azure Analysis Services On-premises data gateway
E. an Azure Batch account

    Correct Answer: B, C

    An Azure Import/Export job allows you to securely import large amounts of data to Azure Blob Storage by shipping hard disk drives to an Azure data center. This is suitable for transferring 500 GB of data. Azure Data Factory is a cloud-based data integration service that can create, schedule, and manage data pipelines. It can copy data from your on-premises file server to Azure Blob Storage using a Self-hosted Integration Runtime on your server, making it a suitable solution for data transfer.
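As a sketch of the Data Factory approach, a minimal copy-activity pipeline definition might look like the following. This is an illustration only: the pipeline and dataset names (`CopyServer1FilesToStore1`, `OnPremFileDataset`, `Store1BlobDataset`) are placeholders, it assumes a Self-hosted Integration Runtime is already registered on Server1, and exact type/property names can differ between ADF API versions.

```json
{
  "name": "CopyServer1FilesToStore1",
  "properties": {
    "activities": [
      {
        "name": "CopyCompanyFiles",
        "type": "Copy",
        "inputs": [ { "referenceName": "OnPremFileDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "Store1BlobDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "FileSystemSource", "recursive": true },
          "sink": { "type": "BlobSink" }
        }
      }
    ]
  }
}
```

The file-system dataset would point at the Server1 share via the self-hosted runtime, and the blob dataset at a container in store1.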

Discussion
EltoothOptions: BC

B & C are correct

Eltooth

https://docs.microsoft.com/en-gb/azure/storage/blobs/storage-blobs-introduction#move-data-to-blob-storage

sw1000Options: BC

A. an Azure Logic Apps integration account - no, this is an integration service with visual If-Then style flows. It does not offer a way to import data from on-premises to Blob Storage.
B. an Azure Import/Export job - agree with the other people here.
C. Azure Data Factory - agree, it is a way of importing data, but for 500 GB it is a bit of overkill.
D. an Azure Analysis Services On-premises data gateway - not a data import option.
E. an Azure Batch account - part of the Azure Batch service and involves HPC job scheduling, but is not a way of importing or exporting data between on-premises and Azure.

Note: For 500 GB we would probably use AzCopy instead. If it was a typo and actually 500 TB, we would use Azure Data Box Heavy, or maybe the Azure Import/Export service if you provide your own drives.
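For reference, the AzCopy alternative mentioned above is a one-liner. This is a hedged sketch, not one of the exam's answer options: the local path and container name are placeholders, and `<SAS-token>` stands for a real shared access signature with write permission on store1.

```shell
# Sketch: one-time recursive copy of the Server1 files to a container in store1.
# "D:\CompanyFiles" and the "files" container are placeholder names.
azcopy copy "D:\CompanyFiles" "https://store1.blob.core.windows.net/files?<SAS-token>" --recursive
```

For 500 GB over a reasonable network link, this is usually simpler than either Import/Export or a Data Factory pipeline.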

memyself2

This question was on my exam today (2/26/23). Scored 844. I agree with this answer.

memo454

This question is on today's exam. The exam is easier than AZ-104.

NotMeAnyWayOptions: BC

B. an Azure Import/Export job
C. Azure Data Factory

B. Azure Import/Export job: this service allows you to securely import or export large amounts of data to or from Azure Blob Storage by shipping hard disk drives to an Azure data center. You can use it to transfer the company files from your on-premises server to the Azure Blob Storage account.

C. Azure Data Factory: a cloud-based data integration service that enables you to create, schedule, and manage data pipelines. You can create a pipeline in Azure Data Factory to copy data from your on-premises file server to Azure Blob Storage. You will need a Self-hosted Integration Runtime installed on your on-premises server to facilitate the data movement between your on-premises server and Azure Blob Storage.

iamhyumi

Got this on Sept. 5, 2023

JimmyYopOptions: BC

appeared in Exam 01/2024

nav109Options: BC

Got this on Nov. 17, 2023

zellckOptions: BC

BC is the answer.
https://learn.microsoft.com/en-gb/azure/storage/blobs/storage-blobs-introduction#move-data-to-blob-storage

A number of solutions exist for migrating existing data to Blob Storage:
- Azure Data Factory supports copying data to and from Blob Storage by using the account key, a shared access signature, a service principal, or managed identities for Azure resources.
- The Azure Import/Export service provides a way to import or export large amounts of data to and from your storage account using hard drives that you provide.

stonwall12Options: BC

Correct Answer - B & C: Azure Import/Export & Azure Data Factory

Azure Import/Export:
- Used for transferring large amounts of data to and from Azure Blob, File, and Disk storage using physical hard drives. It would be suitable for transferring 500 GB of data.
https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service

Azure Data Factory:
- A cloud-based data integration service that can move and integrate data from various sources to various destinations. It would be suitable for copying files from Server1 to Blob Storage.
https://learn.microsoft.com/en-us/azure/data-factory/introduction

lvzOptions: BC

OK, I will go with ADF; however, I don't see the question mentioning the connectivity between on-premises and Azure AD. I think ADF can only be used when on-premises is connected with Azure AD.

ukivanlamlpiOptions: BE

Files are not a fit for Data Factory.

AdventureChick

Data Factory can move files. It isn't just for DBs. I accidentally upvoted this when I went to click reply.

ThaitaniumOptions: BC

B and C

23169fdOptions: BC

B. an Azure Import/Export job - why: you can use the Azure Import/Export service to securely transfer large amounts of data to Azure Blob Storage by shipping hard drives to an Azure data center.
C. Azure Data Factory - why: Azure Data Factory can be used to create a data pipeline that moves files from on-premises to Azure Blob Storage, enabling automated and scheduled transfers.

23169fd

Why not the other options:
A. Azure Logic Apps integration account: designed for integrating workflows and not typically used for bulk data transfer.
D. Azure Analysis Services On-premises data gateway: used for accessing on-premises data sources from Azure Analysis Services, not for transferring files to Blob Storage.
E. Azure Batch account: intended for running large-scale parallel and batch compute jobs, not for transferring files to Blob Storage.

LazylinuxOptions: BC

Given answer is correct

BShelatOptions: BC

Well, B & C seem to be the answers. For B, though, Windows Server 2016 is NOT a supported version based on the following link: https://learn.microsoft.com/en-us/azure/import-export/storage-import-export-requirements

TomdeBom

I think those OS requirements were only meant to describe older versions of Windows that are still supported (I know, this is bad documentation on MS's part, but MS Learn is far from perfect, documentation-wise). The support note is about the waimportexport.exe tool that is used. https://learn.microsoft.com/en-us/previous-versions/azure/storage/common/storage-import-export-tool-preparing-hard-drives-import#requirements-for-waimportexportexe states that Windows 7, Windows Server 2008 R2, or a newer Windows operating system is supported.

totalz

LOL, I see sarcasm in the voted answers. Or maybe it's just me seeing the question differently.

totalz

I mean, B is possible, but a really stupid solution unless there's a typo and it's actually 500 TB! My answers are B & E. My real-life choice is Azure File storage.