Deep Learning Week 3 NPTEL Assignment Answers
Course Name: Deep Learning
Course Link: Click Here
NPTEL Deep Learning Week 3 Assignment 3 Answers (Jan-Apr 2025)
- Find the distance of the 3D point, P = (-3, 1, 3) from the plane defined by
a. 3.1
b. 3.5
c. 0
d. ∞ (infinity)
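The plane equation did not survive in this copy of the question, so only the general recipe can be shown: the distance from a point (x0, y0, z0) to the plane ax + by + cz + d = 0 is |ax0 + by0 + cz0 + d| / √(a² + b² + c²). Below is a minimal sketch with P = (-3, 1, 3) and placeholder plane coefficients (the coefficients are hypothetical, not the assignment's).

```python
import math

def point_plane_distance(p, a, b, c, d):
    """Distance from point p = (x0, y0, z0) to the plane a*x + b*y + c*z + d = 0."""
    x0, y0, z0 = p
    return abs(a * x0 + b * y0 + c * z0 + d) / math.sqrt(a * a + b * b + c * c)

# P from the question; the plane coefficients here are placeholders only
print(point_plane_distance((-3, 1, 3), a=1, b=2, c=2, d=0))
```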
- What is the shape of the loss landscape during optimization of SVM?
a. Linear
b. Paraboloid
c. Ellipsoidal
d. Non-convex with multiple possible local minima
- How many local minima can be encountered while solving the optimization for maximizing margin for SVM?
a. 1
b. 2
c. ∞ (infinite)
d. 0
- Which of the following classifiers can be replaced by a linear SVM?
a. Logistic Regression
b. Neural Networks
c. Decision Trees
d. None of the above
- Find the scalar projection of vector b = <-2, onto vector a =
a. 0
b. 4/√5
c. 2/√17
d. -2/17
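The two vectors in this question are truncated in this copy, so only the general formula applies: the scalar projection of b onto a is (a · b) / ‖a‖. A small sketch follows; the example vectors in it are hypothetical, not the ones from the question.

```python
import math

def scalar_projection(b, a):
    """Scalar projection of vector b onto vector a: (a . b) / ||a||."""
    dot = sum(bi * ai for bi, ai in zip(b, a))
    norm_a = math.sqrt(sum(ai * ai for ai in a))
    return dot / norm_a

# Hypothetical example vectors, not the ones from the question
print(scalar_projection((-2, 3), (4, 1)))   # (-8 + 3) / sqrt(17) = -5/sqrt(17)
```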
- For a 2-class problem, what is the minimum possible number of support vectors? Assume there are more than 4 examples from each class.
a. 4
b. 1
c. 2
d. 8
- Which one of the following is a valid representation of hinge loss (of margin = 1) for a two-class problem?
y = class label (+1 or -1), p = predicted value for a class (a raw score, not normalized to denote any probability).
a. L(y,p)=max(0,1−yp)
b. L(y,p)=min(0,1−yp)
c. L(y,p)=max(0,1+yp)
d. None of the above
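Option (a) is the standard hinge loss. A minimal sanity check in plain Python shows that max(0, 1 - yp) is zero for confidently correct predictions and grows linearly once yp drops below the margin of 1:

```python
def hinge_loss(y, p):
    """Hinge loss with margin 1: y is the class label (+1 or -1),
    p is the raw (unnormalized) predicted score."""
    return max(0.0, 1.0 - y * p)

# Correctly classified and outside the margin -> zero loss
print(hinge_loss(+1, 2.5))   # 0.0
# Correctly classified but inside the margin -> small positive loss
print(hinge_loss(+1, 0.4))   # 0.6
# Misclassified -> loss grows with the confidence of the wrong prediction
print(hinge_loss(-1, 2.0))   # 3.0
```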
- Suppose we have one feature x ∈ ℝ and a binary class y. The dataset consists of 3 points: p1: (x1, y1) = (-1, -1), p2: (x2, y2) = (1, 1), p3: (x3, y3) = (3, 1). Which of the following is true with respect to SVM?
a. Maximum margin will increase if we remove the point p2 from the training set.
b. Maximum margin will increase if we remove the point p3 from the training set.
c. Maximum margin will remain the same if we remove the point p2 from the training set.
d. None of the above.
- If we employ SVM to realize two-input logic gates, then which of the following will be true?
a. The weight vector for the AND gate and the OR gate will be the same.
b. The margin for the AND gate and the OR gate will be the same.
c. Both the margin and the weight vector will be the same for the AND gate and the OR gate.
d. Neither the weight vector nor the margin will be the same for the AND gate and the OR gate.
- What will happen to the margin length of a max-margin linear SVM if one non-support vector training example is removed?
a. Margin will be scaled down by the magnitude of that vector.
b. Margin will be scaled up by the magnitude of that vector.
c. Margin will be unaltered.
d. Cannot be determined from the information provided.
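The last question above can be checked empirically: the max-margin solution depends only on the support vectors, so deleting a non-support example leaves the margin unchanged. The sketch below uses scikit-learn's SVC with a linear kernel and a large C to approximate a hard margin, on a made-up toy dataset of my own; it fits once, drops one training point that is not a support vector, refits, and compares the margin 2/‖w‖.

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D dataset (illustrative, not from the assignment)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],   # class -1
              [3.0, 3.0], [3.0, 4.0], [4.0, 3.0]])   # class +1
y = np.array([-1, -1, -1, 1, 1, 1])

def margin_of(X, y):
    clf = SVC(kernel="linear", C=1e6).fit(X, y)      # large C ~ hard margin
    w = clf.coef_.ravel()
    return 2.0 / np.linalg.norm(w), set(clf.support_)

full_margin, sv_idx = margin_of(X, y)
# Drop one example that is NOT a support vector and refit
drop = next(i for i in range(len(y)) if i not in sv_idx)
reduced_margin, _ = margin_of(np.delete(X, drop, axis=0), np.delete(y, drop))

print(full_margin, reduced_margin)  # the two margins match: the margin is unaltered
```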
Course Link: Click Here
NPTEL Deep Learning Week 3 Assignment 3 Answers 2023
Q1. A 5-dimensional data point [27, 40, -15, 30, 38] obtains the scores [18, 20, -5, -15, 19]. Find the hinge loss incurred by the second class (class-2) with a margin (Δ) of 5.
a. 37
b. 7
c. 3
d. 120
Answer: b. 7
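Where the 7 comes from, assuming the first class (score 18) is the true class, which is the reading that matches the keyed answer: each competing class contributes max(0, s_j - s_true + Δ) to the multiclass hinge loss. A quick check for class-2:

```python
# Scores produced for the 5 classes and the margin from the question
scores = [18, 20, -5, -15, 19]
margin = 5
true_class = 0   # assumed: the keyed answer of 7 only works if class-1 is the true class

# Hinge term contributed by class-2 (index 1)
loss_class2 = max(0, scores[1] - scores[true_class] + margin)
print(loss_class2)   # 20 - 18 + 5 = 7
```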
Q2. What is the shape of the loss landscape during optimization of SVM?
a. Linear
b. Paraboloid
c. Ellipsoidal
d. Non-convex with multiple possible local minima
Answer: b. Paraboloid
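Why a paraboloid: the hard-margin SVM objective is a quadratic function of w minimized under linear constraints, so the loss surface is a convex bowl with exactly one minimum (which is also why the answer to the next question is 1):

```latex
\min_{\mathbf{w},\, b}\ \tfrac{1}{2}\lVert \mathbf{w}\rVert^{2}
\quad \text{subject to} \quad y_i\,(\mathbf{w}^{\top}\mathbf{x}_i + b) \ge 1 \ \ \forall i
```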
Q3. How many local minima can be encountered while solving the optimization for maximizing margin for SVM?
a. 1
b. 2
c. ∞ (infinite)
d. 0
Answer: a. 1
Q4. Which of the following classifiers can be replaced by a linear SVM?
a. Logistic Regression
b. Neural Networks
c. Decision Trees
d. None of the above
Answer: a. Logistic Regression
Q5. Consider a 2-class [y ∈ {-1, 1}] classification problem with 2-dimensional feature vectors. The support vectors, their class labels, and the corresponding Lagrange multipliers are given below. Find the SVM weight vector W.
X₁ = (-1, 1), y₁ = -1, α₁ = 2
X₂ = (0, 3), y₂ = 1, α₂ = 1
X₃ = (0, -1), y₃ = 1, α₃ = 1
a. (-1,3)
b. (2,0)
c. (-2,4)
d. (-2,2)
Answer: b. (2,0)
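The weight vector comes from W = Σᵢ αᵢ yᵢ Xᵢ. A two-line NumPy check, using y₃ = +1 as reconstructed above (the value that makes Σᵢ αᵢ yᵢ = 0 hold and yields the keyed answer):

```python
import numpy as np

X = np.array([[-1, 1], [0, 3], [0, -1]])   # support vectors from the question
y = np.array([-1, 1, 1])                    # class labels
alpha = np.array([2, 1, 1])                 # Lagrange multipliers

W = (alpha * y) @ X                         # W = sum_i alpha_i * y_i * X_i
print(W)                                    # [2 0]
```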
Q6. For a 2-class problem, what is the minimum possible number of support vectors? Assume there are more than 4 examples from each class.
a. 4
b. 1
c. 2
d. 8
Answer: c. 2
Q7. A Support Vector Machine is defined by WᵀX + b = 0, with support vectors Xᵢ, corresponding Lagrange multipliers αᵢ, and class labels yᵢ. Which of the following is true?
a. W = Σᵢ yᵢ αᵢ Xᵢ
b. Σᵢ yᵢ αᵢ = 0
c. L = Σᵢ αᵢ − ½ Σᵢ Σⱼ yᵢ yⱼ αᵢ αⱼ (Xᵢ · Xⱼ)
d. All of the above
Answer: d. All of the above
Q8. Suppose we have one feature x ∈ ℝ and a binary class y. The dataset consists of 3 points: p1: (x1, y1) = (-1, -1), p2: (x2, y2) = (1, 1), p3: (x3, y3) = (3, 1). Which of the following is true with respect to SVM?
a. Maximum margin will increase if we remove the point p2 from the training set.
b. Maximum margin will increase if we remove the point p3 from the training set.
c. Maximum margin will remain the same if we remove the point p2 from the training set.
d. None of the above.
Answer: a. Maximum margin will increase if we remove the point p2 from the training set.
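This can be checked numerically: the margin is fixed by p1 (x = -1) and p2 (x = 1); removing p2 leaves x = 3 as the nearest positive example, so the margin widens, while removing p3 changes nothing. A sketch with scikit-learn's SVC (a large C is my choice here to approximate the hard margin):

```python
import numpy as np
from sklearn.svm import SVC

def margin(X, y):
    clf = SVC(kernel="linear", C=1e6).fit(X, y)
    return 2.0 / np.linalg.norm(clf.coef_)

X = np.array([[-1.0], [1.0], [3.0]])
y = np.array([-1, 1, 1])

print(margin(X, y))                    # ~2.0 with all three points
print(margin(X[[0, 2]], y[[0, 2]]))    # ~4.0 after removing p2 -> margin increases
print(margin(X[[0, 1]], y[[0, 1]]))    # ~2.0 after removing p3 -> margin unchanged
```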
Q9. If we employ SVM to realize two-input logic gates, then which of the following will be true?
a. The weight vector for the AND gate and the OR gate will be the same.
b. The margin for the AND gate and the OR gate will be the same.
c. Both the margin and the weight vector will be the same for the AND gate and the OR gate.
d. Neither the weight vector nor the margin will be the same for the AND gate and the OR gate.
Answer: b. The margin for the AND gate and the OR gate will be the same.
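Treating the two-input truth tables as tiny training sets (class -1/+1 for the gate output), the sketch below fits a near-hard-margin linear SVM to the AND and OR gates and compares the margins; they come out equal, while the separating hyperplanes themselves differ (the bias terms are different). Using scikit-learn's SVC with a large C is my own choice for the illustration.

```python
import numpy as np
from sklearn.svm import SVC

def fit_gate(labels):
    """Fit a (near) hard-margin linear SVM to a two-input gate truth table."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    clf = SVC(kernel="linear", C=1e6).fit(X, np.array(labels))
    margin = 2.0 / np.linalg.norm(clf.coef_)
    return margin, clf.intercept_[0]

and_margin, and_bias = fit_gate([-1, -1, -1, 1])   # AND gate outputs
or_margin, or_bias = fit_gate([-1, 1, 1, 1])       # OR gate outputs

print(and_margin, or_margin)   # equal margins (option b)
print(and_bias, or_bias)       # different bias terms, so the hyperplanes differ
```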
Q10. The values of Lagrange multipliers corresponding to the support vectors can be:
a. Less than zero
b. Greater than zero
c. Any real number
d. Any non-zero number.
Answer: b. Greater than zero
More weeks of Deep Learning: Click Here
More Nptel Courses: https://progiez.com/nptel