MLS-C01 Exam Questions

MLS-C01 Exam - Question 44


A Machine Learning Specialist has created a deep learning neural network model that performs well on the training data but performs poorly on the test data.

Which of the following methods should the Specialist consider using to correct this? (Choose three.)

A. Decrease regularization
B. Increase regularization
C. Increase dropout
D. Decrease dropout
E. Increase feature combinations
F. Decrease feature combinations

Correct Answer: BCF

When a model performs well on the training data but poorly on the test data, it is likely experiencing overfitting. To combat overfitting, several strategies can be employed. Increasing regularization helps by adding a penalty term to the loss function, which discourages the model from fitting too closely to the training data's noise. Increasing dropout involves randomly dropping neurons during training, which forces the model to develop more robust features that are not reliant on specific neurons. Decreasing feature combinations reduces the complexity of the model, limiting the chances of overfitting by simplifying the relationships the model learns. Hence, the most appropriate actions to take are to increase regularization, increase dropout, and decrease feature combinations.
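To make B and C concrete, here is a minimal sketch of both knobs in PyTorch; the framework choice, layer sizes, and hyperparameter values are illustrative assumptions, not part of the question:

```python
# Minimal sketch (assumes PyTorch): two of the correct answers in code.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # answer C: raise p to drop more neurons during training
    nn.Linear(64, 1),
)

# Answer B: weight_decay adds an L2 penalty to the loss; raising it
# strengthens regularization and discourages fitting noise in the data.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```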

Discussion

17 comments
cybe001
Sep 26, 2021

Yes, answer is BCF

Phong
Oct 7, 2021

Go for BCF

AjoseO
Options: BCF
Feb 10, 2023

Increasing regularization helps to prevent overfitting by adding a penalty term to the loss function to discourage the model from learning the noise in the data. Increasing dropout helps to prevent overfitting by randomly dropping out some neurons during training, which forces the model to learn more robust representations that do not depend on the presence of any single neuron. Decreasing the number of feature combinations helps to simplify the model, making it less likely to overfit.

obaidur
Oct 29, 2021

BCF. F is explained in the AWS documentation: "Feature selection: consider using fewer feature combinations, decrease n-grams size, and decrease the number of numeric attribute bins. Increase the amount of regularization used." https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html

Shailendraa
Sep 8, 2022

BCF - Always remember: in case of overfitting, reduce features, add regularization, and increase dropout.

ahquiceno
Nov 2, 2021

BCE: The main objective of PCA (a technique for combining features) is to simplify your model's features into fewer components, which helps you visualize patterns in your data and helps your model run faster. Using PCA also reduces the chance of overfitting by eliminating features with high correlation. https://towardsdatascience.com/dealing-with-highly-dimensional-data-using-principal-component-analysis-pca-fea1ca817fe6
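To make the PCA suggestion above concrete, here is a minimal sketch assuming scikit-learn and an arbitrary random feature matrix; the shapes and component count are illustrative only:

```python
# Minimal sketch (assumes scikit-learn): PCA compresses the feature set
# into fewer components before training, one way to reduce model inputs.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))    # 200 samples, 50 raw features

pca = PCA(n_components=10)        # keep only the 10 strongest components
X_reduced = pca.fit_transform(X)
print(X.shape, "->", X_reduced.shape)   # (200, 50) -> (200, 10)
```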

uninit
Jan 28, 2023

AWS Documentation explicitly mentions reducing feature combinations to prevent overfitting - https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html It's B C F

jackzhao
Mar 15, 2023

BCF is correct.

ninomfr64
Options: BCF
Jun 21, 2024

I think the point here is the definition of "feature combinations". If you read it as "combine the features to generate a smaller but more effective feature set", combining yields a smaller feature set, which is a good thing for overfitting. However, if you read it as "combine the features to generate additional features", combining yields a larger feature set, which is a bad thing for overfitting. Also, in some cases feature combinations are implemented inside the model itself (see hidden layers in a feed-forward network), which increases model complexity and is bad for overfitting. To me this question is poorly worded. I would still pick F: my best guess is that the feature combinations are implemented in the model, so decreasing them decreases complexity and thereby mitigates the overfitting issue.

cloudera3
Jul 9, 2024

Great callout - what exactly "feature combination" means has not been elaborated. It can be: using PCA or t-SNE, which essentially consolidates features - good for addressing overfitting, and should be done. Or it can be: using a Cartesian product, where features are combined to create additional features - this will worsen overfitting and should NOT be done. I wish questions and answer options were written clearly enough to leave no room for ambiguity, especially since, in real life, this kind of write-up would trigger follow-up questions until it was addressed satisfactorily.
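To illustrate the Cartesian-product reading of "feature combinations", here is a minimal sketch assuming scikit-learn, with PolynomialFeatures standing in for a generic feature-cross step; the data is made up:

```python
# Minimal sketch (assumes scikit-learn): pairwise feature crosses grow
# the feature count, so decreasing such combinations shrinks the model.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.arange(12, dtype=float).reshape(4, 3)   # 4 samples, 3 raw features

crosses = PolynomialFeatures(degree=2, interaction_only=True,
                             include_bias=False).fit_transform(X)
print(X.shape[1], "features ->", crosses.shape[1])  # 3 -> 6 (adds x1*x2, x1*x3, x2*x3)
```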

apnu
Oct 30, 2021

BCF 100%

CloudTrail
Nov 1, 2021

B/C/F Easy peasy.

Tomatoteacher
Options: BCE
Jan 15, 2023

I see all the comments for BCF, but when you look at F it only says decrease "feature combinations", not the features themselves. Read one way, decreasing feature combinations results in having more features (less feature engineering), which in turn will cause more overfitting. Unless the question is badly worded and "fewer feature combinations" just means dropping those combined components, it has to be BCE.

AjoseO
Feb 10, 2023

Increasing the number of feature combinations can sometimes improve the performance of a model if the model is underfitting the data. However, in this context, it is not likely to be a solution to overfitting.

cpal012
Apr 3, 2023

Decrease feature combinations - too many irrelevant features can influence the model by drowning out the signal with noise

SRB1337
Jun 21, 2023

It's BCF.

kaike_reis
Options: BCF
Jul 31, 2023

B-C-F. All of those options can be used to reduce model complexity and thus reduce overfitting.

akgarg00
Options: BCE
Nov 7, 2023

BCE. The model has learnt the training data. One approach is to increase bias, either by combining features or by removing some features. In deep learning, I think increasing the feature combinations is the more workable option.

Neet1983
Options: BCF
Dec 30, 2023

Best choices are B (Increase regularization), C (Increase dropout), and F (Decrease feature combinations), as these techniques are effective in reducing overfitting and improving the model's ability to generalize to new data.

Piyush_N
Options: BCF
Mar 3, 2024

If your model is overfitting the training data, it makes sense to take actions that reduce model flexibility. To reduce model flexibility, try the following: Feature selection: consider using fewer feature combinations, decrease n-grams size, and decrease the number of numeric attribute bins. Increase the amount of regularization used. https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html
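As a concrete example of the quoted doc's "decrease n-grams size" advice, here is a minimal sketch assuming scikit-learn; the toy corpus is made up:

```python
# Minimal sketch (assumes scikit-learn): allowing bigrams inflates the
# vocabulary; restricting to unigrams keeps the feature space smaller.
from sklearn.feature_extraction.text import CountVectorizer

docs = ["the model fits the training data",
        "the model fails on the test data"]

for n in (2, 1):
    vocab = CountVectorizer(ngram_range=(1, n)).fit(docs).vocabulary_
    print(f"max n-gram size {n}: {len(vocab)} features")
```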

Denise123
Options: BCE
Apr 3, 2024

About option E: When increasing feature combinations, the goal is not to simply add more features indiscriminately, which could indeed lead to overfitting. Instead, it involves selecting and combining features in a way that captures important patterns and relationships in the data. When done effectively, increasing feature combinations can help the model generalize better to unseen data by providing more informative and discriminative features, thus reducing the risk of overfitting.