Exam: Certified Machine Learning Professional
Question 40

A machine learning engineer needs to deliver predictions of a machine learning model in real time. However, the feature values needed for computing the predictions are available one week before the query time.

Which of the following is a benefit of using a batch serving deployment in this scenario rather than a real-time serving deployment where predictions are computed at query time?

    Correct Answer: E

    Querying stored predictions can be faster than computing predictions in real-time. Since the feature values are available one week before the query time, predictions can be precomputed and stored using a batch serving deployment. This allows for quick retrieval of predictions when needed, as opposed to the potentially slower process of computing predictions on the fly in a real-time serving deployment. Additionally, batch processing can make efficient use of resources by performing computations during off-peak times.
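    As an illustration, below is a minimal Python sketch of the offline half of this pattern. The model object, the weekly_features DataFrame, the entity_id key, and the predictions/latest.parquet path are assumed names for illustration only, not something stated in the question.

        import pandas as pd

        def run_weekly_batch_job(model, weekly_features: pd.DataFrame) -> None:
            # Score every entity in bulk, well before any user query arrives.
            scored = weekly_features.copy()
            scored["prediction"] = model.predict(
                weekly_features.drop(columns=["entity_id"])
            )
            # Persist only the key and the prediction; the online path never
            # needs to load the model or recompute features.
            scored[["entity_id", "prediction"]].to_parquet(
                "predictions/latest.parquet", index=False
            )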

Discussion
BokNinja (Option: E)

The correct answer is E. Querying stored predictions can be faster than computing predictions in real-time. In this scenario, since the feature values needed for computing the predictions are available one week before the query time, the predictions can be precomputed using a batch serving deployment. When the predictions are needed, they can be quickly retrieved from storage, which can be faster than computing the predictions in real-time. This approach also allows for the efficient use of resources, as the computational work can be done during off-peak times.

hugodscarvalho (Option: E)

In a batch serving deployment, predictions are computed offline, typically in bulk, using data that is available before the query time. These predictions are stored and can be quickly queried when needed, providing faster response times compared to computing predictions in real-time, especially when the feature values needed for computing predictions are available well in advance.
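For the serving half, a minimal sketch (reusing the same illustrative parquet file and entity_id key from above, which are assumptions) shows why the query path stays fast: it is only a lookup of a stored result, with no feature computation or model inference at query time.

    import pandas as pd

    # Load the precomputed predictions once at startup into an in-memory lookup.
    _predictions = (
        pd.read_parquet("predictions/latest.parquet")
          .set_index("entity_id")["prediction"]
          .to_dict()
    )

    def predict(entity_id):
        # Serving is a single dictionary lookup on the stored result,
        # which is typically faster than invoking the model per request.
        return _predictions[entity_id]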