# ML Deep Learning Fundamentals Applications Week 4 Answers

Are you searching for reliable ML Deep Learning Fundamentals Applications Week 4 Answers 2024? Look no further! Our solutions are designed to provide clear, detailed answers, helping you navigate your NPTEL course with confidence.

## ML Deep Learning Fundamentals Applications Week 4 Answers (July-Dec 2024)

1. How many decision boundaries are there in one-vs-all classification with \( c \) classes?
A) \( c \)
B) \( \frac{c(c-1)}{2} \)
C) \( \frac{c(c+1)}{2} \)
D) None of the above

2. To avoid the problem of an ambiguous region of a linear discriminant function for \( c \) categories, we can:
A) Define \( c \) linear functions \( g_i(x) \), one for each class, for \( i = 1, 2, \dots, c \)
B) Assign \( x \) to \( w_j \) if \( g_i(x) < g_j(x) \) for all \( i \neq j \)
C) Take a linear machine classifier
D) All the above

3. Which of the following statements is true about the learning rate in Gradient Descent?
A) A very high learning rate may lead to oscillation
B) A lower learning rate may lead to faster convergence
C) The learning rate doesn’t determine the size of the steps taken towards the minimum
D) The learning rate has no effect on the convergence of Gradient Descent
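As a quick illustration (not part of the course material), a minimal gradient-descent sketch on \( f(x) = x^2 \) shows why a very high learning rate leads to oscillation while a moderate one converges:

```python
# Gradient descent on f(x) = x**2, whose gradient is 2*x.
def gradient_descent(lr, x=10.0, steps=20):
    for _ in range(steps):
        x = x - lr * 2 * x  # step size is proportional to the learning rate
    return x

small = gradient_descent(lr=0.1)  # each step multiplies x by 0.8: smooth convergence to 0
large = gradient_descent(lr=1.1)  # each step multiplies x by -1.2: the iterates oscillate and diverge
```

With `lr=0.1` the iterate shrinks toward the minimum at 0; with `lr=1.1` it flips sign every step and grows without bound, which is exactly the oscillation described in option A.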

4. In the Perceptron algorithm for a binary classifier, what happens to the weight vector when a positive misclassified point is encountered?
A) It remains the same
B) It is increased
C) It is decreased
D) It is multiplied by a constant
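The update in Question 4 can be sketched as follows, assuming labels in \( \{+1, -1\} \) and a learning rate of 1 (a common textbook convention, not necessarily the course's exact notation):

```python
import numpy as np

def perceptron_step(w, x, y):
    """One perceptron update for a binary classifier with labels y in {+1, -1}."""
    if y * np.dot(w, x) <= 0:  # the point is misclassified (or on the boundary)
        w = w + y * x          # positive point: the weights are increased by x
    return w

w = np.array([0.0, 0.0])
w = perceptron_step(w, np.array([1.0, 2.0]), +1)  # misclassified positive point
```

For a misclassified positive point the weight vector moves toward \( x \) (it is increased); for a misclassified negative point it would move away.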

These are ML Deep Learning Fundamentals Applications Week 4 Answers

5. Let \( w_{ij} \) represent the weight between node \( i \) at layer \( k \) and node \( j \) at layer \( (k-1) \) of a given multilayer perceptron. The weight update using the gradient descent method is given by:
A) \( w_{ij}(t+1) = w_{ij}(t) + \alpha \frac{\partial E}{\partial w_{ij}}, \, 0 \leq \alpha \leq 1 \)
B) \( w_{ij}(t+1) = w_{ij}(t) - \alpha \frac{\partial E}{\partial w_{ij}}, \, 0 \leq \alpha \leq 1 \)
C) \( w_{ij}(t+1) = \alpha \frac{\partial E}{\partial w_{ij}}, \, 0 \leq \alpha \leq 1 \)
D) \( w_{ij}(t+1) = -\alpha \frac{\partial E}{\partial w_{ij}}, \, 0 \leq \alpha \leq 1 \)


6. A 4-input neuron has weights 3, 4, 5, and 6. The transfer function is linear with the constant of proportionality being equal to 3. The inputs are 6, 12, 10, and 20 respectively. What will be the output?
A) 238
B) 76
C) 708
D) 123
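The arithmetic behind Question 6 can be checked directly: the net input is the weighted sum of the inputs, and the linear transfer function scales it by the constant of proportionality, 3:

```python
weights = [3, 4, 5, 6]
inputs = [6, 12, 10, 20]
k = 3  # constant of proportionality of the linear transfer function

net = sum(w * x for w, x in zip(weights, inputs))  # 18 + 48 + 50 + 120 = 236
output = k * net
print(output)  # 708
```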

7. Which of these is true about discriminant classifiers?
A) Assume conditional independence of features
B) Robust to outliers
C) Can perform classification if some missing data points are present
D) All the above

8. A set of training samples is given. Using the Support Vector Machine algorithm, the marginal line for the classification can be calculated as:
A) \( -5.32x_1 - 7.193x_2 + 9.09 = 0 \)
B) \( -6.67x_1 + 8.134x_2 - 9.09 = 0 \)
C) \( -7.21x_1 - 9.173x_2 + 9.09 = 0 \)
D) \( 8.21x_1 + 7.12x_2 - 9.09 = 0 \)

9. Referring to Question 8, a new test sample \( (0.5, 0.5) \) is given. The class of this sample is:
A) Positive
B) Negative
C) Both classes
D) Can’t say
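Whichever hyperplane in Question 8 is correct, the mechanics for Question 9 are the same: evaluate the discriminant at the test point and take its sign. A sketch, using option A's coefficients purely for illustration:

```python
def classify(w1, w2, b, x1, x2):
    """Sign of the discriminant g(x) = w1*x1 + w2*x2 + b decides the class."""
    g = w1 * x1 + w2 * x2 + b
    return "Positive" if g > 0 else "Negative"

# Coefficients taken from option A of Question 8, purely as an illustration:
print(classify(-5.32, -7.193, 9.09, 0.5, 0.5))  # g = -2.66 - 3.5965 + 9.09 > 0
```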

10. What is the main objective of a Support Vector Machine (SVM)?
A) To maximize the number of support vectors
B) To minimize the margin between classes
C) To maximize the training accuracy
D) To find a hyperplane that separates classes with the maximum margin
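For context on the maximum-margin objective in Question 10: if the hyperplane is scaled so that \( |w \cdot x + b| = 1 \) at the support vectors, the margin width is \( 2 / \lVert w \rVert \). A tiny sketch with a hypothetical weight vector (chosen only for illustration):

```python
import math

# For a hyperplane w.x + b = 0 scaled so |w.x + b| = 1 at the support vectors,
# the margin between the classes is 2 / ||w||; SVM training maximizes this.
w = (3.0, 4.0)  # hypothetical weight vector, for illustration only
margin = 2 / math.hypot(*w)  # ||w|| = 5, so the margin is 0.4
```

Maximizing the margin is equivalent to minimizing \( \lVert w \rVert \), which is why SVM training is posed as a norm-minimization problem subject to the classification constraints.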