Certified Machine Learning Professional Exam Questions

Certified Machine Learning Professional Exam - Question 10


Which of the following is a reason for using Jensen-Shannon (JS) distance over a Kolmogorov-Smirnov (KS) test for numeric feature drift detection?

Correct Answer: DE

Jensen-Shannon (JS) distance produces a value between 0 and 1 that represents the divergence between two distributions. This value can be interpreted directly and doesn't require setting arbitrary thresholds or cutoffs. In contrast, the Kolmogorov-Smirnov (KS) test often involves determining a critical value based on the chosen significance level.
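The contrast described above can be sketched with scipy, which provides both measures; `scipy.spatial.distance.jensenshannon` returns the JS distance (bounded in [0, 1] with base 2) and `scipy.stats.ks_2samp` returns a test statistic plus a p-value that must be compared against a chosen significance level. The data here is synthetic and for illustration only:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, 5000)  # reference (training-time) feature values
cur = rng.normal(0.5, 1.0, 5000)  # current (serving-time) values, drifted

# JS distance compares discrete distributions, so bin both samples
# on a shared set of bin edges first.
edges = np.histogram_bin_edges(np.concatenate([ref, cur]), bins=30)
p, _ = np.histogram(ref, bins=edges)
q, _ = np.histogram(cur, bins=edges)

# scipy normalizes the histograms internally; base=2 bounds the result in [0, 1].
js = jensenshannon(p, q, base=2)

# The KS test yields a statistic and p-value; deciding "drift or not"
# still requires picking a significance level (e.g. 0.05).
stat, pvalue = ks_2samp(ref, cur)

print(f"JS distance:  {js:.3f}")
print(f"KS statistic: {stat:.3f}, p-value: {pvalue:.3g}")
```

Note that binning is a modeling choice for the JS approach, but the resulting score is directly interpretable on a fixed 0-to-1 scale, whereas the KS p-value only becomes a drift decision once a significance cutoff is chosen.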

Discussion

4 comments
Alishahab70 · Option: D
Feb 7, 2024

D is correct

ThoBustos
May 8, 2024

not sure about this one...

james_donquixote · Option: E
May 26, 2024

This is a key advantage of using Jensen-Shannon divergence. It produces a value between 0 and 1, which represents the divergence between two distributions. This value can be interpreted without needing to set arbitrary thresholds or cutoffs. In contrast, the KS test involves comparing the test statistic to a critical value, which can depend on the significance level chosen.

Jackeyquan · Option: D
Jun 28, 2024

D is the answer. For E, you also need to set a threshold.