Exam SAA-C03
Question 139

A reporting team receives files each day in an Amazon S3 bucket. The reporting team manually reviews and copies the files from this initial S3 bucket to an analysis S3 bucket each day at the same time to use with Amazon QuickSight. Additional teams are starting to send more files in larger sizes to the initial S3 bucket.

The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket. The reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data. In addition, the reporting team wants to send the data files to a pipeline in Amazon SageMaker Pipelines.

What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A. Create a Lambda function to copy the files to the analysis S3 bucket. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.

B. Create a Lambda function to copy the files to the analysis S3 bucket. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.

C. Configure S3 replication between the S3 buckets. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.

D. Configure S3 replication between the S3 buckets. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.

    Correct Answer: D

    The solution with the least operational overhead uses S3 replication to automatically copy files from the initial S3 bucket to the analysis S3 bucket. By configuring the analysis S3 bucket to send event notifications to Amazon EventBridge, you can then set up an ObjectCreated rule in EventBridge and configure both the pattern-matching Lambda function and the SageMaker pipeline as targets for that rule. This setup leverages managed services to minimize custom implementation and maintenance, making it the most efficient and automated approach.
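    As a rough illustration of option D's event side, here is a minimal boto3 sketch. The region, account ID, bucket, function, pipeline, and role names are all hypothetical placeholders, and it assumes the analysis bucket has already been configured to send event notifications to EventBridge (see the later sketch). It creates one EventBridge rule matching Object Created events from the analysis bucket and attaches both the pattern-matching Lambda function and a SageMaker pipeline as targets.

    import json
    import boto3

    events = boto3.client("events")

    # Match "Object Created" events that S3 delivers to EventBridge
    # for the analysis bucket (name is a placeholder).
    pattern = {
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": ["analysis-bucket"]}},
    }

    events.put_rule(
        Name="analysis-object-created",
        EventPattern=json.dumps(pattern),
        State="ENABLED",
    )

    # One rule, two targets: the pattern-matching Lambda function and the
    # SageMaker pipeline. EventBridge needs an IAM role to start the pipeline.
    events.put_targets(
        Rule="analysis-object-created",
        Targets=[
            {
                "Id": "pattern-matching-lambda",
                "Arn": "arn:aws:lambda:us-east-1:111122223333:function:pattern-matcher",
            },
            {
                "Id": "reporting-pipeline",
                "Arn": "arn:aws:sagemaker:us-east-1:111122223333:pipeline/reporting-pipeline",
                "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-to-sagemaker",
            },
        ],
    )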

Discussion
Six_Fingered_Jose (Option: D)

I go for D here. A and B say you copy the file to the other bucket using a Lambda function; C and D just use S3 replication to copy the files. They accomplish exactly the same thing, but C and D do not require setting up Lambda, which should be more efficient. The question says the team is manually copying the files; automatically replicating the files should be the most efficient method versus copying manually or with Lambda.
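For reference, the replication half that replaces the manual (or Lambda-based) copy could look roughly like the boto3 sketch below. Bucket names and the IAM role ARN are placeholder assumptions, and note that S3 replication requires versioning on both buckets.

    import boto3

    s3 = boto3.client("s3")

    # S3 replication requires versioning to be enabled on both buckets.
    for bucket in ("initial-bucket", "analysis-bucket"):
        s3.put_bucket_versioning(
            Bucket=bucket,
            VersioningConfiguration={"Status": "Enabled"},
        )

    # Replicate every new object from the initial bucket to the analysis bucket.
    s3.put_bucket_replication(
        Bucket="initial-bucket",
        ReplicationConfiguration={
            "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
            "Rules": [
                {
                    "ID": "copy-to-analysis",
                    "Status": "Enabled",
                    "Priority": 1,
                    "Filter": {},
                    "DeleteMarkerReplication": {"Status": "Disabled"},
                    "Destination": {"Bucket": "arn:aws:s3:::analysis-bucket"},
                }
            ],
        },
    )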

vipyodha

Yes, D, because of least operational overhead, and also because an S3 event notification can only send to SNS, SQS, and Lambda, not to SageMaker. EventBridge can send to SageMaker.

Abdou1604

But the reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data, and a drawback of S3 replication is that it copies everything.

pentium75

The Lambda functions should run "on the copied data": first copy, THEN run the Lambda function, which is achieved by D.

123jhl0 (Option: B)

C and D aren't the answers, as replicating the S3 bucket isn't efficient: other teams are starting to use it to store larger docs not related to the reporting, making replication not useful. As Amazon SageMaker Pipelines, ..., is now supported as a target for routing events in Amazon EventBridge, the answer is B. https://aws.amazon.com/about-aws/whats-new/2021/04/new-options-trigger-amazon-sagemaker-pipeline-executions/

KADSM

Not sure how well Lambda would cope with larger files, given the time limit in place.

byteb

"The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket." Replication is asynchronous; with Lambda the data will be available faster. So I think A is the answer.

JayBee65

I think you are misinterpreting the question. I think you need to use all the files, including the ones provided by other teams; otherwise, how can you tell which files to copy? I think the point of this statement is to show that more files are in use, and are being copied at different times, rather than to suggest you need to differentiate between the two sources of files.

jdr75

You misinterpret it... the reporting team is overloaded because more teams are requesting their services and uploading more data to the bucket. That's the reason the reporting team needs to automate the process. So ALL the bucket objects need to be copied to the other bucket, and replication is better and cheaper than using Lambda. So the answer is D.

vipyodha

But B is not the least operational overhead; D is.

AntonioMinolfi (Option: D)

Utilizing a Lambda function would introduce additional operational overhead, eliminating options A and B. S3 replication offers a simpler setup and accomplishes the task efficiently. S3 notifications cannot use SageMaker as a destination; the permissible destinations are SQS, SNS, Lambda, and EventBridge, so C is out.

studynoplay (Option: D)

S3 can NOT send event notifications to SageMaker, which rules out C. You have to send to Amazon EventBridge first, and then to SageMaker. See https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-how-to-event-types-and-destinations.html#supported-notification-destinations
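As a minimal sketch of that indirection (bucket name is a placeholder): the bucket itself cannot target SageMaker, but a single boto3 call turns on delivery of all its events to EventBridge, where a rule can then target a SageMaker pipeline.

    import boto3

    s3 = boto3.client("s3")

    # S3 event notifications cannot target SageMaker directly, but the bucket
    # can be configured to deliver all of its events to EventBridge instead.
    s3.put_bucket_notification_configuration(
        Bucket="analysis-bucket",
        NotificationConfiguration={"EventBridgeConfiguration": {}},
    )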

eendee (Option: D)

Why do I believe it is not C? The key here is the s3:ObjectCreated:Put event type. Replication will not fire the s3:ObjectCreated:Put event. See the link here: https://aws.amazon.com/blogs/aws/s3-event-notification/
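For comparison, here is roughly what option C's direct notification might look like as a boto3 sketch (bucket name and function ARN are hypothetical placeholders). Note the narrow s3:ObjectCreated:Put event type, which per the comment above would not match objects written by replication.

    import boto3

    s3 = boto3.client("s3")

    # Option C's style: a direct S3 event notification. Only SQS, SNS, Lambda
    # (and EventBridge) are valid destinations, and s3:ObjectCreated:Put is
    # narrower than s3:ObjectCreated:*.
    s3.put_bucket_notification_configuration(
        Bucket="analysis-bucket",
        NotificationConfiguration={
            "LambdaFunctionConfigurations": [
                {
                    "Id": "pattern-matching-on-put",
                    "LambdaFunctionArn": "arn:aws:lambda:us-east-1:111122223333:function:pattern-matcher",
                    "Events": ["s3:ObjectCreated:Put"],
                }
            ]
        },
    )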

Guru4Cloud (Option: D)

Option D is the solution with the least operational overhead:

- Use S3 replication between buckets
- Send S3 events to EventBridge
- Add Lambda and SageMaker as EventBridge rule targets

The reasons this has the least overhead:

- S3 replication automatically copies new objects to the analysis bucket
- EventBridge allows easily adding multiple targets for events
- No custom Lambda function is needed for copying objects
- It leverages managed services for event processing

jatric (Option: D)

S3 events can't be used to notify SageMaker, so C can't be the right option. A and B require Lambda, which is unnecessary.

lofzee

The answer is D because it requires the least operational overhead, and S3 replication does the copying for you. Lambda and SageMaker are not supported destinations for S3 Event Notifications; also read this: https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html

lofzee

Sorry, I meant that SageMaker is not supported as an S3 Event Notification destination. Lambda is, though. Still doesn't change what the answer is... D.

vijaykamal (Option: D)

Creating a Lambda function to do the copying is overhead; this dismisses A and B. S3 event notifications cannot be directed to SageMaker directly; this dismisses C. Correct answer: D.

fantastique007

You are right. This is the key point: SageMaker cannot be the destination of an S3 event notification.

MutiverseAgent (Option: D)

Correct: D. B and D are the only possibilities, as SageMaker is not supported as a target for S3 events. Using bucket replication, as D mentions, is more efficient than using a Lambda function, as B mentions.

cookieMr (Option: D)

Option D is correct because it combines S3 replication, event notifications, and Amazon EventBridge to automate the copying of files from the initial S3 bucket to the analysis S3 bucket. It also allows for the execution of Lambda functions and integration with SageMaker Pipelines.

Option A is incorrect because it suggests copying the files with a Lambda function and event notifications, but it does not utilize S3 replication or EventBridge for automation.

Option B is incorrect because it suggests using S3 event notifications with EventBridge, but it does not involve S3 replication for copying the files.

Option C is incorrect because it only involves S3 replication and event notifications, without utilizing EventBridge for further processing.

gmehra (Option: A)

The answer is A. The statement says move the file. Replication won't move the file; it will just create a copy, so obviously C and D are out. When you have an event notification and Lambda, why do we need EventBridge as one more service? So the answer is A.

Kaireny54

A and B also say: create a Lambda function to COPY. Then, following your idea, A and B are out too... ;) Obviously the "move" argument isn't accurate in this question.

markw92

I searched the S3 documentation and couldn't find anywhere that an S3 event notification can trigger SageMaker pipelines. It can trigger SNS, SQS, and Lambda. I am not sure A is the right choice.

andyngkh86

I go for C, because option C doesn't need event notifications configured, while D needs extra work to configure the event notification. For the least operational overhead, option C is the best choice.

Marco_St (Option: D)

B is the first option I rejected, since it has the event happen inside the analysis bucket to trigger the Lambda function. If the Lambda function runs code to copy files from the initial bucket to the analysis bucket, then the function should be triggered by an event in the initial bucket, i.e., once the data arrives in the initial bucket, the Lambda is triggered. D is the answer.

TariqKipkemei (Option: D)

D provides the least operational overhead.

kraken21 (Option: D)

Automated moving and Lambda for pattern matching are both covered efficiently by D.

SuketuKohli

Only one destination type can be specified for each event notification in S3 event notifications.