Users are spending excess time researching duplicate cases to determine whether to process or resolve the cases.
Which two options allow you to reduce the number of potential duplicate cases? (Choose two.)
To reduce the number of potential duplicate cases, you can increase the weights of the weighted conditions and increase the weighted condition sum threshold. Increasing the weight of each condition makes the criteria for identifying duplicates more stringent. Similarly, increasing the sum threshold means that only cases that meet a higher combined weighting are flagged as potential duplicates. Together, these adjustments filter the list and reduce the number of possible duplicate cases that users need to evaluate.
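The weighted-sum mechanism being debated here can be sketched in a few lines. This is a minimal illustration only: the condition names, weights, and threshold values below are made up for the example and are not actual Pega configuration or API.

```python
# Hypothetical sketch of weighted duplicate matching: a case is flagged as a
# potential duplicate when the sum of the weights of the weighted conditions
# that matched meets or exceeds the sum threshold.

def is_potential_duplicate(matched_conditions, weights, threshold):
    """Return True if the combined weight of matched conditions reaches the threshold."""
    score = sum(weights[c] for c in matched_conditions)
    return score >= threshold

# Illustrative weights (not from any real case type).
weights = {"same_email": 50, "same_name": 30, "same_zip": 20}

# A pair matching on email and zip scores 50 + 20 = 70.
print(is_potential_duplicate({"same_email", "same_zip"}, weights, threshold=60))  # True
# Raising the threshold to 80 filters that same pair out of the duplicate list.
print(is_potential_duplicate({"same_email", "same_zip"}, weights, threshold=80))  # False
```

The sketch shows why the two settings pull in opposite directions: raising the threshold (with weights fixed) means fewer case pairs qualify, while raising the weights (with the threshold fixed) means more pairs qualify, which is the crux of the disagreement in the comments below.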
To decrease the number of duplicates, I would have thought you'd want to increase the threshold and decrease the weights, so that fewer cases qualify. That points to A and C.
BD is the right answer
BD. The scenario effectively says users are examining too many possible duplicates themselves because Pega presents those cases as not duplicates. So how do you make Pega, not the users, evaluate more duplicates? That would be decreasing the sum threshold and increasing the weights of the conditions.