Data Mining Week 4 Nptel Assignment Answers
Are you looking for Data Mining Week 4 Nptel Assignment Answers? You’ve come to the right place! Access the most accurate answers at Progiez.
Data Mining Week 4 Nptel Assignment Answers (Jan-Apr 2025)
Course Link: Click Here
Q1. Maximum a posteriori classifier is also known as:
a) Decision tree classifier
b) Bayes classifier
c) Gaussian classifier
d) Maximum margin classifier
Q2. If we are provided with an infinite-sized training set, which of the following classifiers will have the lowest error probability?
a) Decision tree
b) K-nearest neighbor classifier
c) Bayes classifier
d) Support vector machine
Q3. Let A be an example, and C be a class. The probability P(C|A) is known as:
a) Apriori probability
b) Aposteriori probability
c) Class conditional probability
d) None of the above
Q4. Let A be an example, and C be a class. The probability P(C) is known as:
a) Apriori probability
b) Aposteriori probability
c) Class conditional probability
d) None of the above
Q5. A bank classifies its customers into two classes, “fraud” and “normal,” based on their installment payment behavior. Given:
- P(fraud) = 0.20
- P(default) = 0.40
- P(default | fraud) = 0.80
What is the probability of a customer who defaults in payment being a fraud?
a) 0.80
b) 0.60
c) 0.40
d) 0.20
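Q5 is a direct application of Bayes’ theorem, P(fraud | default) = P(default | fraud) · P(fraud) / P(default). A minimal sketch of the arithmetic in Python (the variable names are my own):

```python
# Given values from the question
p_fraud = 0.20                 # prior probability of the "fraud" class
p_default = 0.40               # marginal probability of defaulting
p_default_given_fraud = 0.80   # class-conditional probability

# Bayes' theorem: P(fraud | default) = P(default | fraud) * P(fraud) / P(default)
p_fraud_given_default = p_default_given_fraud * p_fraud / p_default
print(p_fraud_given_default)   # 0.4
```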
Q6. Consider two binary attributes X and Y. Given:
- P(X=1) = 0.6
- P(Y=0) = 0.4
What is the probability that both X and Y have values 1?
a) 0.06
b) 0.16
c) 0.26
d) 0.36
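Since only the marginal probabilities are given, Q6 is answerable only under the assumption that X and Y are independent. A short sketch under that assumption:

```python
# Assumption: X and Y are independent, so the joint probability factorises
p_x1 = 0.6          # P(X = 1)
p_y1 = 1 - 0.4      # P(Y = 1) = 1 - P(Y = 0) = 0.6

p_both_one = p_x1 * p_y1
print(p_both_one)   # 0.36
```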
Q7. Consider a binary classification problem with two classes C1 and C2. Class labels of ten training set instances sorted in increasing order of their distance to an instance x are:
{C1, C2, C1, C2, C2, C2, C1, C2, C1, C2}.
How will a K=7 nearest neighbor classifier classify x?
a) There will be a tie
b) C1
c) C2
d) Not enough information to classify
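Q7 reduces to a majority vote over the labels of the 7 nearest neighbors. A quick sketch using only the Python standard library:

```python
from collections import Counter

# Labels of the training instances sorted by distance to x (nearest first)
neighbors = ["C1", "C2", "C1", "C2", "C2", "C2", "C1", "C2", "C1", "C2"]

k = 7
votes = Counter(neighbors[:k])     # tally the 7 nearest labels
print(votes)                       # Counter({'C2': 4, 'C1': 3})
print(votes.most_common(1)[0][0])  # majority vote -> 'C2'
```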
Q8. Given the following training set for a classification problem with two classes, “fraud” and “normal”:
| A1 | A2 | Class  |
|----|----|--------|
| 1  | 0  | fraud  |
| 1  | 1  | fraud  |
| 1  | 1  | fraud  |
| 1  | 0  | normal |
| 1  | 1  | fraud  |
| 0  | 0  | normal |
| 0  | 0  | normal |
| 0  | 0  | normal |
| 1  | 1  | normal |
| 1  | 0  | normal |
What is the estimated apriori probability P(fraud) of the class fraud?
a) 0.2
b) 0.4
c) 0.6
d) 0.8
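Q8 is a counting exercise: the apriori probability is estimated as the fraction of training rows labelled “fraud”. A small sketch with the table hard-coded as (A1, A2, class) tuples:

```python
# Training set from Q8 as (A1, A2, class) tuples
data = [
    (1, 0, "fraud"), (1, 1, "fraud"), (1, 1, "fraud"), (1, 0, "normal"),
    (1, 1, "fraud"), (0, 0, "normal"), (0, 0, "normal"), (0, 0, "normal"),
    (1, 1, "normal"), (1, 0, "normal"),
]

# P(fraud) estimated as (number of fraud rows) / (total rows)
p_fraud = sum(1 for _, _, c in data if c == "fraud") / len(data)
print(p_fraud)  # 4 / 10 = 0.4
```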
Q9. Given the same training set as above, what is the estimated class conditional probability P(A1=1, A2=1 | fraud)?
a) 0.25
b) 0.50
c) 0.75
d) 1.00
Q10. The Bayes classifier classifies the instance (A1=1, A2=1) into which class?
a) fraud
b) normal
c) There will be a tie
d) Not enough information to classify
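Q9 and Q10 follow from the same counts. The sketch below estimates the class-conditional probabilities directly from the table and then applies Bayes’ rule, picking the class with the larger P(A | C) · P(C); the helper names `class_conditional` and `prior` are my own:

```python
# Same training set as Q8, written as (A1, A2, class) tuples
data = [
    (1, 0, "fraud"), (1, 1, "fraud"), (1, 1, "fraud"), (1, 0, "normal"),
    (1, 1, "fraud"), (0, 0, "normal"), (0, 0, "normal"), (0, 0, "normal"),
    (1, 1, "normal"), (1, 0, "normal"),
]

def class_conditional(a1, a2, cls):
    """Estimate P(A1=a1, A2=a2 | class=cls) by counting."""
    in_class = [row for row in data if row[2] == cls]
    matches = [row for row in in_class if row[0] == a1 and row[1] == a2]
    return len(matches) / len(in_class)

def prior(cls):
    """Estimate P(class=cls) as a fraction of the training set."""
    return sum(1 for row in data if row[2] == cls) / len(data)

# Q9: class-conditional probability for the fraud class
print(class_conditional(1, 1, "fraud"))   # 3/4 = 0.75

# Q10: Bayes rule -- compare P(A1=1, A2=1 | C) * P(C) for each class
for cls in ("fraud", "normal"):
    score = class_conditional(1, 1, cls) * prior(cls)
    print(cls, round(score, 3))
# fraud  0.3   (0.75 * 0.4)
# normal 0.1   (1/6  * 0.6)  -> the larger score is for "fraud"
```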
For answers to other NPTEL courses, please refer to this link: NPTEL Assignment