Deep Learning | Week 3

Course Name: Deep Learning

Course Link: Click Here

These are NPTEL Deep Learning Week 3 Assignment 3 Answers


Q1. A data point with 5 dimensions, [27, 40, -15, 30, 38], obtains the scores [18, 20, -5, -15, 19]. Find the hinge loss incurred by the second class (class-2) with a margin (Δ) of 5.
a. 37
b. 7
c. 3
d. 120

Answer: b. 7
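
Reasoning: the multi-class hinge loss treats class-2 (score 20) as the true class and sums max(0, sⱼ − s₂ + Δ) over the remaining classes: max(0, 18−20+5) + max(0, −5−20+5) + max(0, −15−20+5) + max(0, 19−20+5) = 3 + 0 + 0 + 4 = 7. A minimal Python sketch of the same computation (variable names are illustrative only):

scores = [18, 20, -5, -15, 19]   # scores for the 5 classes
true_idx = 1                     # class-2 (0-based index)
delta = 5                        # margin
loss = sum(max(0, s - scores[true_idx] + delta)
           for j, s in enumerate(scores) if j != true_idx)
print(loss)                      # 3 + 0 + 0 + 4 = 7 -> option b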


Q2. What is the shape of the loss landscape during the optimization of an SVM?
a. Linear
b. Paraboloid
c. Ellipsoidal
d. Non-convex with multiple possible local minima

Answer: b. Paraboloid
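
The soft-margin SVM objective (1/2)||w||² + C Σᵢ max(0, 1 − yᵢ(wᵀxᵢ + b)) is a convex quadratic-plus-hinge function, so its surface is a bowl-shaped paraboloid with a single global minimum (which is also why Q3 below has exactly one minimum). A minimal numerical convexity check, using toy data assumed purely for illustration:

import numpy as np

# Toy data, assumed only for this illustration
X = np.array([[-1.0, 1.0], [0.0, 3.0], [0.0, -1.0]])
y = np.array([-1.0, 1.0, 1.0])
C = 1.0

def svm_objective(w):
    # 0.5*||w||^2 + C * sum of hinge losses (bias omitted for simplicity)
    return 0.5 * w @ w + C * np.maximum(0, 1 - y * (X @ w)).sum()

rng = np.random.default_rng(0)
a, b = rng.normal(size=2), rng.normal(size=2)
# Convexity: the midpoint never lies above the chord
print(svm_objective((a + b) / 2) <= (svm_objective(a) + svm_objective(b)) / 2)  # True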


Q3. How many local minima can be encountered while solving the optimization for maximizing the margin of an SVM?
a. 1
b. 2
c. ∞ (infinite)
d. 0

Answer: a. 1


Q4. Which of the following classifiers can be replaced by a linear SVM?
a. Logistic Regression
b. Neural Networks
c. Decision Trees
d. None of the above

Answer: a. Logistic Regression


Q5. Consider a 2-class [y = {-1, 1}] classification problem with 2-dimensional feature vectors. The support vectors, their class labels and their Lagrangian multipliers are provided below. Find the value of the SVM weight vector W.
X₁ = (-1, 1), y₁ = -1, α₁ = 2
X₂ = (0, 3), y₂ = 1, α₂ = 1
X₃ = (0, -1), y₃ = 1, α₃ = 1

a. (-1,3)
b. (2,0)
c. (-2,4)
d. (-2,2)

Answer: b. (2,0)
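
Reasoning: W = Σᵢ αᵢ yᵢ Xᵢ = 2·(−1)·(−1, 1) + 1·(1)·(0, 3) + 1·(1)·(0, −1) = (2, −2) + (0, 3) + (0, −1) = (2, 0). A minimal numpy check of the same sum:

import numpy as np

# Support vectors, labels and Lagrange multipliers from Q5
X = np.array([[-1.0, 1.0], [0.0, 3.0], [0.0, -1.0]])
y = np.array([-1.0, 1.0, 1.0])
alpha = np.array([2.0, 1.0, 1.0])

W = (alpha * y) @ X        # W = sum_i alpha_i * y_i * X_i
print(W)                   # [2. 0.] -> option b
print(np.sum(alpha * y))   # 0.0, consistent with the dual constraint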


Q6. For a 2-class problem, what is the minimum possible number of support vectors? Assume there are more than 4 examples from each class.
a. 4
b. 1
c. 2
d. 8

Answer: c. 2


Q7. A Support Vector Machine is defined by WᵀX + b = 0, with support vectors Xᵢ, corresponding Lagrangian multipliers αᵢ, and class labels yᵢ. Which of the following is true?
a. W = Σᵢ yᵢ αᵢ Xᵢ
b. Σᵢ yᵢ αᵢ = 0
c. L = Σᵢ αᵢ − (1/2) Σᵢ Σⱼ yᵢ yⱼ αᵢ αⱼ XᵢᵀXⱼ
d. All of the above

Answer: d. All of the above
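
All three are standard results of the SVM dual: (a) is the stationarity condition for W, (b) the stationarity condition for the bias b, and (c) the dual objective L. As a quick numerical illustration, they can be evaluated on the Q5 support vectors (reused here purely as example numbers):

import numpy as np

X = np.array([[-1.0, 1.0], [0.0, 3.0], [0.0, -1.0]])
y = np.array([-1.0, 1.0, 1.0])
alpha = np.array([2.0, 1.0, 1.0])

W = (alpha * y) @ X              # (a) W = sum_i y_i * alpha_i * X_i
constraint = np.sum(alpha * y)   # (b) sum_i y_i * alpha_i = 0
K = X @ X.T                      # Gram matrix of dot products X_i . X_j
L = alpha.sum() - 0.5 * (alpha * y) @ K @ (alpha * y)   # (c) dual objective
print(W, constraint, L)          # [2. 0.] 0.0 2.0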


Q8. Suppose we have one feature x ∈ ℝ and a binary class y. The dataset consists of 3 points: p1: (x1, y1) = (-1, -1), p2: (x2, y2) = (1, 1), p3: (x3, y3) = (3, 1). Which of the following is true with respect to SVM?
a. Maximum margin will increase if we remove the point p2 from the training set.
b. Maximum margin will increase if we remove the point p3 from the training set.
c. Maximum margin will remain same if we remove the point p2 from the training set.
d. None of the above.

Answer: a. Maximum margin will increase if we remove the point p2 from the training set.
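
Reasoning: in 1-D the maximum margin is half the gap between the closest points of the two classes. With all three points the closest opposite-class pair is p1 = −1 and p2 = 1, giving a margin of 1; removing p2 widens the gap to p1 = −1 versus p3 = 3, giving a margin of 2, while removing p3 changes nothing. A minimal scikit-learn sketch (a large C is assumed here to approximate a hard-margin SVM):

import numpy as np
from sklearn.svm import SVC

def max_margin(points, labels):
    clf = SVC(kernel="linear", C=1e6)        # large C ~ hard margin
    clf.fit(np.array(points, dtype=float).reshape(-1, 1), labels)
    return 1.0 / np.linalg.norm(clf.coef_)   # geometric margin = 1 / ||w||

print(max_margin([-1, 1, 3], [-1, 1, 1]))    # ~1.0 with all three points
print(max_margin([-1, 3], [-1, 1]))          # ~2.0 after removing p2
print(max_margin([-1, 1], [-1, 1]))          # ~1.0 after removing p3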


Q9. If we employ an SVM to realize two-input logic gates, then which of the following will be true?
a. The weight vector for the AND gate and the OR gate will be the same.
b. The margin for the AND gate and the OR gate will be the same.
c. Both the margin and the weight vector will be the same for the AND gate and the OR gate.
d. Neither the weight vector nor the margin will be the same for the AND gate and the OR gate.

Answer: b. The margin for the AND gate and the OR gate will be the same.


Q10. The values of Lagrange multipliers corresponding to the support vectors can be:
a. Less than zero
b. Greater than zero
c. Any real number
d. Any non-zero number

Answer: b. Greater than zero
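
By the KKT complementary-slackness condition, a point with αᵢ = 0 does not contribute to W and is not a support vector, so every support vector carries a strictly positive multiplier. A minimal scikit-learn check (note that dual_coef_ stores yᵢαᵢ for the support vectors, so the αᵢ are its absolute values; the toy data are assumed for illustration):

import numpy as np
from sklearn.svm import SVC

X = np.array([[-1.0, 1.0], [0.0, 3.0], [0.0, -1.0], [2.0, 2.0]])
y = np.array([-1, 1, 1, 1])

clf = SVC(kernel="linear", C=10.0).fit(X, y)
alphas = np.abs(clf.dual_coef_).ravel()   # alpha_i for each support vector
print(alphas)
print(np.all(alphas > 0))                 # True: support vectors have alpha > 0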



More weeks of Deep Learning: Click Here

More Nptel Courses: https://progiez.com/nptel


The content uploaded on this website is for reference purposes only. Please attempt the assignment yourself first.