Certified Machine Learning Professional Exam Questions

Certified Machine Learning Professional Exam - Question 21


Which of the following MLflow operations can be used to automatically calculate and log a Shapley feature importance plot?

Correct Answer: AC

The MLflow operation that automatically calculates and logs a Shapley feature importance plot is mlflow.shap.log_explanation. This function computes and logs explanations of an ML model's output using SHAP (SHapley Additive exPlanations), logging the base values, the SHAP values, and a summary bar plot showing the average impact of each feature on the model output.

Discussion

4 comments
random_data_guy (Option: A)
Dec 27, 2023

Answer A. https://mlflow.org/docs/latest/python_api/mlflow.shap.html#mlflow.shap.log_explanation "... computes and logs explanations of an ML model’s output. Explanations are logged as a directory of artifacts containing the following items generated by SHAP (SHapley Additive exPlanations). - Base values - SHAP values (computed using shap.KernelExplainer) - Summary bar plot (shows the average impact of each feature on model output)"

BokNinja (Option: C)
Dec 19, 2023

C. mlflow.shap

mozuca (Option: A)
Dec 27, 2023

mlflow.shap automatically calculates and logs Shapley feature importance plots:

# Generate and log SHAP plot for first 5 records
mlflow.shap.log_explanation(rf.predict, X_train[:5])

hugodscarvalho (Option: A)
Jan 27, 2024

C is partially correct; however, I am going with A since it names the method itself. Using mlflow.shap.log_explanation, you can automatically calculate and log Shapley feature importance plots, providing insight into how much each feature contributes to the model's predictions.