AWS Certified Machine Learning Engineer - Associate MLA-C01 Exam Questions

AWS Certified Machine Learning Engineer - Associate MLA-C01 Exam - Question 66


A company has used Amazon SageMaker to deploy a predictive ML model in production. The company is using SageMaker Model Monitor on the model. After a model update, an ML engineer notices data quality issues in the Model Monitor checks.

What should the ML engineer do to mitigate the data quality issues that Model Monitor has identified?

Correct Answer:

Discussion

5 comments
Ell89 (Option: D)
Dec 31, 2024

The model needs to be retrained.

GiorgioGss (Option: C)
Nov 28, 2024

If the problems start appearing "After a model update" then C is the only valid option.

Saransundar (Option: C)
Dec 4, 2024

Model Monitor flags data quality issues --> create a new baseline --> validate the baseline --> update Model Monitor with the new baseline --> re-evaluate data quality --> investigate and fix the root cause (if issues persist) --> monitor continuously.
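For reference, if the intended fix is indeed to create a new baseline, re-baselining with the SageMaker Python SDK roughly follows the pattern below. This is a minimal sketch, not the exam's official answer: the execution role, S3 paths, and instance settings are placeholders, and the baseline dataset is assumed to reflect the updated model's expected inputs.

```python
from sagemaker.model_monitor import DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

# Placeholder role and S3 paths -- substitute real values for your account.
monitor = DefaultModelMonitor(
    role="<execution-role-arn>",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=3600,
)

# Run a baselining job on data that represents the updated model's expected
# input distribution (for example, the training set used for the update).
monitor.suggest_baseline(
    baseline_dataset="s3://example-bucket/updated-training-data/train.csv",
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://example-bucket/model-monitor/new-baseline",
    wait=True,
)
```

The baselining job writes statistics.json and constraints.json to the output location, which can then be validated and attached to the existing monitoring schedule.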

Certified101 (Option: C)
Feb 6, 2025

Agree with GiorgioGss - if the problems start appearing "after a model update", then C is the only valid option.

eesa (Option: C)
Mar 21, 2025

Amazon SageMaker Model Monitor continuously monitors model endpoints to detect issues such as:
- Data quality drift
- Model quality drift
- Bias drift
- Feature attribution drift

Baseline: Model Monitor compares incoming data against a baseline dataset that represents "normal" or expected data distributions. If the data distribution changes after a model update, the old baseline may no longer be valid, leading to false positives in data quality issues.
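To make the monitoring checks compare against the updated baseline, the existing schedule can be pointed at the new statistics and constraints. The sketch below is a hedged example using the SageMaker Python SDK; the schedule name and S3 URIs are placeholders and assume a baselining job like the one above has already completed.

```python
from sagemaker.model_monitor import DefaultModelMonitor, CronExpressionGenerator

# Attach to the schedule Model Monitor already runs for the endpoint
# (the schedule name is a placeholder).
monitor = DefaultModelMonitor.attach(
    monitor_schedule_name="predictive-model-monitor-schedule"
)

# Update the schedule to use the statistics and constraints produced by the
# new baselining job, so future checks use the post-update data distribution.
monitor.update_monitoring_schedule(
    statistics="s3://example-bucket/model-monitor/new-baseline/statistics.json",
    constraints="s3://example-bucket/model-monitor/new-baseline/constraints.json",
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```

If data quality violations persist after re-baselining, that points to a genuine root cause (for example, a change in the inference request format introduced by the model update) rather than a stale baseline.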