Data Mining Week 3 Nptel Assignment Answers
Are you looking for Data Mining Week 3 Nptel Assignment Answers? You’ve come to the right place! Access the most accurate answers at Progiez.
Data Mining Week 3 Nptel Assignment Answers (Jan-Apr 2025)
Course Link: Click Here
- A decision tree is an algorithm for:
a. Classification
b. Clustering
c. Association rule mining
d. Noise filtering
- Leaf nodes of a decision tree correspond to:
a. Attributes
b. Classes
c. Data instances
d. None of the above
- Non-leaf nodes of a decision tree correspond to:
a. Attributes
b. Classes
c. Data instances
d. None of the above
- Which of the following criteria is used to decide which attribute to split on next in a decision tree:
a. Support
b. Confidence
c. Entropy
d. Scatter
- If we convert a decision tree to a set of logical rules, then:
a. The internal nodes in a branch are connected by AND and the branches by AND
b. The internal nodes in a branch are connected by OR and the branches by OR
c. The internal nodes in a branch are connected by AND and the branches by OR
d. The internal nodes in a branch are connected by OR and the branches by AND
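As a concrete illustration (a hypothetical tree, not one given in the course), the tests along a single branch combine with AND, and the branches themselves combine with OR:

```python
# Hypothetical two-level decision tree for a fraud/normal task,
# written as nested ifs, then as the equivalent logical rules.

def classify(a1, a2):
    # Branch 1: A1 = 1 AND A2 = 1 -> fraud
    # Branch 2: A1 = 1 AND A2 = 0 -> normal
    # Branch 3: A1 = 0           -> normal
    if a1 == 1:
        if a2 == 1:
            return "fraud"
        return "normal"
    return "normal"

# Equivalent rule set: node tests within a branch are ANDed,
# and the branches are ORed together:
#   (A1=1 AND A2=1 -> fraud) OR (A1=1 AND A2=0 -> normal) OR (A1=0 -> normal)
print(classify(1, 1))  # fraud
```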
- The purpose of pruning a decision tree is:
a. Improving training set classification accuracy
b. Improving generalization performance
c. Dimensionality reduction
d. Tree balancing
- Given the following training set for a classification problem into two classes “fraud” and “normal”, splitting on which attribute in the root of a decision tree will lead to the highest information gain?
A1 | A2 | Class |
---|---|---|
1 | 0 | fraud |
1 | 1 | fraud |
1 | 1 | fraud |
1 | 0 | normal |
1 | 1 | fraud |
0 | 0 | normal |
0 | 0 | normal |
0 | 0 | normal |
1 | 1 | normal |
1 | 0 | normal |
a. A1
b. A2
c. There will be a tie among the attributes
d. Not enough information to decide
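The information gain of each candidate split can be checked directly from the table. The sketch below (a stdlib-only Python computation over the ten rows given above) computes the entropy reduction for both A1 and A2:

```python
import math
from collections import Counter

# Training set from the question: (A1, A2, class)
data = [
    (1, 0, "fraud"), (1, 1, "fraud"), (1, 1, "fraud"), (1, 0, "normal"),
    (1, 1, "fraud"), (0, 0, "normal"), (0, 0, "normal"), (0, 0, "normal"),
    (1, 1, "normal"), (1, 0, "normal"),
]

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(data, attr):
    """Entropy reduction from splitting on attribute index attr (0 = A1, 1 = A2)."""
    labels = [row[2] for row in data]
    parent = entropy(labels)
    n = len(data)
    children = 0.0
    for v in {row[attr] for row in data}:
        subset = [row[2] for row in data if row[attr] == v]
        children += len(subset) / n * entropy(subset)
    return parent - children

print(f"gain(A1) = {info_gain(data, 0):.3f}")  # gain(A1) = 0.281
print(f"gain(A2) = {info_gain(data, 1):.3f}")  # gain(A2) = 0.256
```

Splitting on A1 sends all three A1=0 rows to a pure "normal" child, which is why it reduces entropy more than A2.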
- Given the following training set, the entropy is:
A1 | A2 | Class |
---|---|---|
1 | 0 | fraud |
1 | 1 | fraud |
1 | 1 | fraud |
1 | 0 | normal |
1 | 1 | fraud |
0 | 0 | normal |
0 | 0 | normal |
0 | 0 | normal |
1 | 1 | normal |
1 | 0 | normal |
a. 0
b. −(4/10)·log(4/10) − (6/10)·log(6/10)
c. −log(4/10) − log(6/10)
d. 1
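The table has 4 "fraud" and 6 "normal" instances out of 10, so the entropy is −(4/10)·log(4/10) − (6/10)·log(6/10); with base-2 logs this evaluates to about 0.971 bits:

```python
import math

# Class counts from the table above: 4 "fraud" and 6 "normal" out of 10
p_fraud, p_normal = 4 / 10, 6 / 10
H = -p_fraud * math.log2(p_fraud) - p_normal * math.log2(p_normal)
print(round(H, 3))  # 0.971
```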
- Given the following training set, splitting on attribute A1 in the root leads to an entropy reduction of:
A1 | A2 | Class |
---|---|---|
1 | 0 | fraud |
1 | 1 | fraud |
1 | 1 | fraud |
1 | 0 | normal |
1 | 1 | fraud |
0 | 0 | normal |
0 | 0 | normal |
0 | 0 | normal |
1 | 1 | normal |
1 | 0 | normal |
a. –log(4/10)-log(6/10)
b. −(4/10)·log(4/10) − (6/10)·log(6/10) + (4/7)·log(4/7) + (3/7)·log(3/7)
c. −(4/10)·log(4/10) − (6/10)·log(6/10) + (4/7)·log(4/7) + (3/7)·log(3/7) + 1
d. 1
- Given the following training set, splitting on attribute A2 in the root leads to an entropy reduction of:
A1 | A2 | Class |
---|---|---|
1 | 0 | fraud |
1 | 1 | fraud |
1 | 1 | fraud |
1 | 0 | normal |
1 | 1 | fraud |
0 | 0 | normal |
0 | 0 | normal |
0 | 0 | normal |
1 | 1 | normal |
1 | 0 | normal |
a. –log(4/10)-log(6/10)
b. −(4/10)·log(4/10) − (6/10)·log(6/10) + (1/4)·log(1/4) + (3/7)·log(3/7)
c. −(4/10)·log(4/10) − (6/10)·log(6/10) + (3/4)·log(3/4) + (1/4)·log(1/4) + (1/6)·log(1/6) + (5/6)·log(5/6)
d. 1
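Working the A2 split numerically: A2=1 covers 4 rows (3 fraud, 1 normal) and A2=0 covers 6 rows (1 fraud, 5 normal). Note that a properly weighted reduction multiplies each child entropy by its fraction of the data (4/10 and 6/10), which the option text above omits. A short base-2 check:

```python
import math

def H(*ps):
    """Entropy (base 2) of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

parent = H(4/10, 6/10)  # entropy before the split, ~0.971 bits
# A2 = 1: 4 rows (3 fraud, 1 normal); A2 = 0: 6 rows (1 fraud, 5 normal)
children = 4/10 * H(3/4, 1/4) + 6/10 * H(1/6, 5/6)
print(round(parent - children, 3))  # 0.256
```

This ~0.256-bit reduction is smaller than the ~0.281 bits obtained by splitting on A1, consistent with A1 being the better root split.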
- Decision trees can be used for:
a. Classification only
b. Regression only
c. Both classification and regression
d. Neither classification nor regression
For answers to other NPTEL courses, please refer to this link: NPTEL Assignment