Data Mining Week 5 Nptel Assignment Answers
Are you looking for Data Mining Week 5 Nptel Assignment Answers? You’ve come to the right place! Access the most accurate answers at Progiez.
Data Mining Week 5 Nptel Assignment Answers (Jan-Apr 2025)
Course Link: Click Here
- Support vector machine may be termed as:
A. Maximum apriori classifier
B. Maximum margin classifier
C. Minimum apriori classifier
D. Minimum margin classifier
- In a hard margin support vector machine:
A. No training instances lie inside the margin
B. All the training instances lie inside the margin
C. Only a few training instances lie inside the margin
D. None of the above
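The two questions above both describe the geometry of the maximum margin classifier: the hard-margin SVM chooses the separating hyperplane with the widest margin, and no training instance may lie inside that margin. A minimal sketch of this idea (toy separable data assumed; scikit-learn's SVC with a very large C used to approximate a hard margin):

```python
import numpy as np
from sklearn.svm import SVC

# Toy, linearly separable data (assumed for illustration only)
X = np.array([[1.0, 1.0], [2.0, 2.5], [1.5, 0.5],
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.5]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates the hard-margin (maximum margin) classifier
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
print("margin width = 2/||W|| =", 2.0 / np.linalg.norm(w))

# Hard margin: every training point satisfies y_i (W^T X_i + b) >= 1,
# i.e. no training instance lies strictly inside the margin
print("min y_i (W^T X_i + b) =", (y * (X @ w + b)).min())  # ~1
```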
- If the hyperplane WᵀX + b = 0 correctly classifies all the training points (Xᵢ, yᵢ), where yᵢ ∈ {+1, −1}, then:
A. ||W-1|| = 2
B. X = 1
C. WᵀXᵢ + b ≥ 0 for all i
D. yᵢ(WᵀXᵢ + b) ≥ 0 for all i
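For intuition about the options above: a point (Xᵢ, yᵢ) sits on the correct side of the hyperplane exactly when yᵢ and (WᵀXᵢ + b) share the same sign, i.e. when their product is non-negative. A tiny NumPy check, where the hyperplane and points are made-up values used only to illustrate the test:

```python
import numpy as np

# Hypothetical hyperplane W^T X + b = 0 and two labelled points (made-up values)
W, b = np.array([1.0, -1.0]), 0.5
X = np.array([[3.0, 1.0],    # W^T X + b =  2.5 -> lies on the +1 side
              [0.0, 2.0]])   # W^T X + b = -1.5 -> lies on the -1 side
y = np.array([+1, -1])

scores = X @ W + b
# Correct classification of every point  <=>  y_i (W^T X_i + b) >= 0 for all i
print(np.all(y * scores >= 0))  # True
```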
- The constraint in the primal optimization problem solved to obtain the hard margin optimal separating hyperplane is:
A. yᵢ(WᵀXᵢ + b) ≥ 1 for all i
B. yᵢ(WᵀXᵢ + b) ≤ 1 for all i
C. (WᵀXᵢ + b) ≥ 1 for all i
D. (WᵀXᵢ + b) ≤ 1 for all i
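For context, the constraint asked about above belongs to the standard hard-margin primal problem, which maximizes the margin 2/‖W‖ by minimizing ‖W‖² while requiring every training point to be classified with functional margin at least 1:

```latex
\min_{W,\,b} \ \tfrac{1}{2}\lVert W \rVert^{2}
\qquad \text{subject to} \qquad
y_i\,(W^{T} X_i + b) \ge 1 \quad \text{for all } i
```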
- The constraint in the dual optimization problem solved to obtain the hard margin optimal separating hyperplane is:
A. yᵢ(WᵀXᵢ + b) ≥ 1 for all i
B. yᵢ(WᵀXᵢ + b) ≤ 1 for all i
C. αᵢ ≥ 0 for all i, where the αᵢ are the Lagrange multipliers
D. αᵢ ≤ 0 for all i, where the αᵢ are the Lagrange multipliers
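Introducing one Lagrange multiplier αᵢ per primal constraint and eliminating W and b gives the standard dual problem; its constraints are the ones the question above refers to:

```latex
\max_{\alpha}\ \sum_i \alpha_i
  - \tfrac{1}{2}\sum_i\sum_j \alpha_i \alpha_j\, y_i y_j\, (X_i \cdot X_j)
\qquad \text{subject to} \qquad
\alpha_i \ge 0 \ \text{for all } i,
\qquad \sum_i \alpha_i y_i = 0
```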
- In a hard margin SVM, support vectors lie:
A. inside the margin
B. on the margin
C. outside the margin
D. can be either inside or outside the margin
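In the hard-margin solution, the support vectors are exactly the training points whose constraint is tight, i.e. yⱼ(WᵀXⱼ + b) = 1, so they sit on the margin itself. A small scikit-learn sketch (toy data assumed, large C approximating the hard margin) that verifies this:

```python
import numpy as np
from sklearn.svm import SVC

# Toy separable data (assumed); a very large C approximates the hard-margin case
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 1.5],
              [3.0, 3.0], [4.0, 3.5], [3.5, 4.5]])
y = np.array([-1, -1, -1, 1, 1, 1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
# Decision values at the support vectors come out as +/-1: they lie on the margin
sv_scores = clf.support_vectors_ @ w + b
print(np.round(sv_scores, 3))
```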
- The Hessian matrix considered in SVM design has elements of the form:
A. Xᵢ · Xⱼ
B. yᵢ − yⱼ
C. yᵢyⱼ(Xᵢ − Xⱼ)ᵀ(Xᵢ − Xⱼ)
D. yᵢyⱼ(Xᵢ · Xⱼ)
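The Hessian of the dual objective is the label-weighted Gram matrix, with entries H[i, j] = yᵢyⱼ(Xᵢ · Xⱼ). A minimal NumPy construction on made-up data:

```python
import numpy as np

# Made-up feature vectors and labels, for illustration only
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
y = np.array([+1, -1, +1])

# H[i, j] = y_i * y_j * (X_i . X_j): element-wise product of the label outer
# product with the Gram matrix of the training vectors
H = np.outer(y, y) * (X @ X.T)
print(H)
```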
- The dual optimization problem in SVM design is usually solved using:
A. Genetic programming
B. Neural programming
C. Dynamic programming
D. Quadratic programming
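The dual is a quadratic program: a quadratic objective in α with linear constraints. As a sketch of how it can be handed to an off-the-shelf QP solver, the snippet below uses cvxopt's `solvers.qp`, which minimizes ½xᵀPx + qᵀx subject to Gx ≤ h and Ax = b (the choice of cvxopt and the toy data are assumptions; any QP solver would do):

```python
import numpy as np
from cvxopt import matrix, solvers

# Toy separable data (assumed for illustration)
X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
n = len(y)

# Dual: maximize sum(alpha) - 1/2 alpha^T H alpha
#   <=> minimize 1/2 alpha^T H alpha - sum(alpha)   (cvxopt minimizes)
H = np.outer(y, y) * (X @ X.T)
P = matrix(H)
q = matrix(-np.ones(n))
G = matrix(-np.eye(n))        # -alpha_i <= 0, i.e. alpha_i >= 0
h = matrix(np.zeros(n))
A = matrix(y.reshape(1, -1))  # equality constraint: sum_i alpha_i y_i = 0
b = matrix(0.0)

alpha = np.array(solvers.qp(P, q, G, h, A, b)["x"]).ravel()
print(np.round(alpha, 4))     # non-zero alphas correspond to the support vectors
```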
- The generalization constant C is used to tune the:
A. test error only
B. training error only
C. relative weightage to training and test error
D. none of the above
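A short scikit-learn sketch (overlapping toy data assumed) of what the constant C controls in the soft-margin objective ½‖W‖² + C·Σξᵢ: a small C favours a wide margin at the cost of more training errors, while a large C penalizes training errors heavily:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# Two overlapping Gaussian blobs (made-up data, not linearly separable)
X = np.vstack([rng.randn(50, 2) - 1.0, rng.randn(50, 2) + 1.0])
y = np.array([-1] * 50 + [1] * 50)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    margin = 2.0 / np.linalg.norm(clf.coef_[0])
    train_error = 1.0 - clf.score(X, y)
    print(f"C={C:>6}: margin width={margin:.2f}, training error={train_error:.2f}")
```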
- In a hard margin SVM WᵀX + b = 0, suppose the Xⱼ's are the support vectors and the αⱼ's are the corresponding Lagrange multipliers. Which of the following statements are correct?
A. W = Σ αⱼ yⱼ Xⱼ
B. Σ αⱼ yⱼ = 0
C. Either A or B
D. Both A and B
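Both expressions in options A and B arise as stationarity conditions of the Lagrangian (∂L/∂W = 0 gives W = Σ αⱼyⱼXⱼ, and ∂L/∂b = 0 gives Σ αⱼyⱼ = 0), and both can be checked numerically. In scikit-learn, `dual_coef_` stores the products αⱼyⱼ for the support vectors (toy data below is assumed):

```python
import numpy as np
from sklearn.svm import SVC

# Toy separable data (assumed); a very large C approximates the hard-margin case
X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 3.0]])
y = np.array([-1, -1, 1, 1])
clf = SVC(kernel="linear", C=1e6).fit(X, y)

alpha_y = clf.dual_coef_[0]                   # entries are alpha_j * y_j, one per support vector
W_from_dual = alpha_y @ clf.support_vectors_  # W = sum_j alpha_j y_j X_j

print(np.allclose(W_from_dual, clf.coef_[0])) # True: matches the fitted W
print(np.isclose(alpha_y.sum(), 0.0))         # True: sum_j alpha_j y_j = 0
```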
For answers to other Nptel courses, please refer to this link: NPTEL Assignment