Professional Machine Learning Engineer Exam Questions

Professional Machine Learning Engineer Exam - Question 58


You are working on a Neural Network-based project. The dataset provided to you has columns with different ranges. While preparing the data for model training, you discover that gradient optimization is having difficulty moving weights to a good solution. What should you do?

Correct Answer: B

When working with a neural network and encountering difficulties in gradient optimization, especially in the presence of features with different ranges, the appropriate step is to normalize the data. Normalization scales the dataset's numeric fields to a common range, typically between 0 and 1. This process ensures that features contribute equally to the gradient descent optimization, facilitating better and faster convergence towards an optimal solution. Therefore, using the representation transformation (normalization) technique is the appropriate approach in this scenario.
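As a sketch of what the normalization step looks like in practice, here is min-max scaling on a small synthetic dataset (the columns and values below are purely illustrative, not from the question):

```python
import numpy as np

# Illustrative dataset: two columns with very different ranges
# (e.g. something like age in years vs. income in dollars).
X = np.array([
    [25.0, 40_000.0],
    [35.0, 90_000.0],
    [45.0, 60_000.0],
    [55.0, 120_000.0],
])

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    """Scale each column independently to the [0, 1] range."""
    col_min = x.min(axis=0)
    col_max = x.max(axis=0)
    return (x - col_min) / (col_max - col_min)

X_norm = min_max_normalize(X)
print(X_norm)  # every column now spans exactly [0, 1]
```

In a real pipeline you would fit the scaling parameters on the training split only (e.g. with scikit-learn's `MinMaxScaler`) and reuse them on validation and test data, so that no information leaks across splits.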

Discussion

16 comments
kurasaki (Option: B)
Jul 10, 2021

Vote for B. We could impute instead of removing the column, to avoid loss of information.

pddddd (Option: B)
Sep 28, 2021

I also think it is B: "The presence of feature value X in the formula will affect the step size of the gradient descent. The difference in ranges of features will cause different step sizes for each feature. To ensure that the gradient descent moves smoothly towards the minima and that the steps for gradient descent are updated at the same rate for all the features, we scale the data before feeding it to the model."
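A minimal sketch of the effect this comment describes, using plain gradient descent on a synthetic linear-regression problem (all data and learning rates below are made up for illustration): when one feature's range is roughly 1000x the other's, the largest learning rate that does not diverge is tiny, so the small-range feature's weight barely moves; after scaling both features to [0, 1], the same number of steps converges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: feature 0 in [0, 1], feature 1 in [0, 1000].
X = np.column_stack([rng.random(200), rng.random(200) * 1000.0])
y = X @ np.array([2.0, 0.003])  # arbitrary "true" weights

def final_mse(X, y, lr, steps=500):
    """Run plain gradient descent on mean squared error; return final loss."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        w -= lr * (2.0 / n) * X.T @ (X @ w - y)
    return float(np.mean((X @ w - y) ** 2))

# Unscaled: the large-range feature forces lr down to ~1e-6 to avoid
# divergence, which is far too small for the other weight to move.
mse_raw = final_mse(X, y, lr=1e-6)

# Scaled to [0, 1]: a much larger lr is stable and both weights converge.
X_scaled = X / X.max(axis=0)
mse_scaled = final_mse(X_scaled, y, lr=0.5)

print(mse_raw, mse_scaled)  # scaled loss is orders of magnitude smaller
```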

MK_Ahsan (Option: B)
Jan 8, 2022

B. The problem does not mention anything about missing values. It needs to normalize the features with different ranges.

ggorzki (Option: B)
Jan 19, 2022

normalization https://developers.google.com/machine-learning/data-prep/transform/transform-numeric

ralf_cc (Option: B)
Jul 10, 2021

B - remove the outliers?

omar_bh
Jul 16, 2021

Normalization is more complicated than that. Normalization rescales the dataset's numeric fields to a common scale without distorting differences in the ranges of values. Normalization is required only when features have different ranges.

kaike_reis (Option: B)
Nov 17, 2021

(B) NN models need features with comparable ranges; SGD converges well with features on a [0, 1] scale; and the question specifically mentions "different ranges". Documentation: https://developers.google.com/machine-learning/data-prep/transform/transform-numeric

Y2Data (Option: C)
Sep 17, 2021

When gradient descent fails, it's because of the lack of a powerful feature. Using normalization would make it worse. Instead, either A or C would increase the strength of certain features. But C should come first, since A is only feasible after at least one meaningful training run. So C.

ares81 (Option: B)
Jan 5, 2023

Normalization is the word.

fragkris (Option: B)
Dec 5, 2023

B - The key phrase is "different ranges", therefore we need to normalize the values.

MultiCloudIronMan (Option: B)
Apr 1, 2024

Because the ranges need to be normalized.

PhilipKoku (Option: B)
Jun 6, 2024

B) Option B (Use the representation transformation technique) is the most relevant choice. Normalizing the features will help gradient descent converge efficiently, leading to better weight updates and improved model performance. Remember that feature scaling is crucial for gradient optimization, especially when dealing with features that have different ranges. By ensuring consistent scales, you’ll enhance the effectiveness of your Neural Network training process.

NamitSehgal (Option: C)
Jan 5, 2022

Looking at the explanation, I would choose C as well.

hiromi (Option: B)
Dec 15, 2022

B "Normalization" is the keyword

SergioRubiano (Option: B)
May 2, 2023

Normalization

M25 (Option: B)
May 9, 2023

Went with B