Exam: Certified Machine Learning Professional
Question 10

Which of the following is a reason for using Jensen-Shannon (JS) distance over a Kolmogorov-Smirnov (KS) test for numeric feature drift detection?

    Correct Answer: E

    Jensen-Shannon (JS) distance produces a value between 0 and 1 that represents the divergence between two distributions. This value can be interpreted directly and doesn't require setting arbitrary thresholds or cutoffs. In contrast, the Kolmogorov-Smirnov (KS) test often involves determining a critical value based on the chosen significance level.
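To make the contrast concrete, here is a minimal sketch of both drift checks using SciPy. The feature samples are synthetic stand-ins for a reference (training) distribution and a drifted production distribution; the bin count of 50 is an arbitrary choice for discretizing the numeric feature before computing JS distance.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
reference = rng.normal(loc=0.0, scale=1.0, size=10_000)  # training-time feature values
current = rng.normal(loc=0.5, scale=1.2, size=10_000)    # production feature values (drifted)

# JS distance compares discrete distributions, so bin both samples on a shared grid.
bins = np.histogram_bin_edges(np.concatenate([reference, current]), bins=50)
p, _ = np.histogram(reference, bins=bins, density=True)
q, _ = np.histogram(current, bins=bins, density=True)

# With base=2, SciPy's JS distance is bounded in [0, 1]:
# 0 = identical distributions, 1 = maximally different.
js = jensenshannon(p, q, base=2)

# The KS test returns a statistic and a p-value, which must be compared
# against a chosen significance level (e.g. 0.05) to decide on drift.
ks_stat, p_value = ks_2samp(reference, current)

print(f"JS distance: {js:.3f}")
print(f"KS statistic: {ks_stat:.3f}, p-value: {p_value:.3g}")
```

The sketch illustrates the point of the answer: the JS distance is a bounded score readable on its own, while acting on the KS result still requires picking a significance level.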

Discussion
Jackeyquan (Option: D)

D is the answer. For E, you also need to set a threshold.

james_donquixote (Option: E)

This is a key advantage of using Jensen-Shannon divergence. It produces a value between 0 and 1, which represents the divergence between two distributions. This value can be interpreted without needing to set arbitrary thresholds or cutoffs. In contrast, the KS test involves comparing the test statistic to a critical value, which can depend on the significance level chosen.

ThoBustos

Not sure about this one...

Alishahab70 (Option: D)

D is correct