Data Mining Week 3 Nptel Assignment Answers

Are you looking for Data Mining Week 3 Nptel Assignment Answers? You’ve come to the right place! Access the most accurate answers at Progiez.



Data Mining Week 3 Nptel Assignment Answers (Jan-Apr 2025)

Course Link: Click Here


  1. A decision tree is an algorithm for:
    a. Classification
    b. Clustering
    c. Association rule mining
    d. Noise filtering



  2. Leaf nodes of a decision tree correspond to:
    a. Attributes
    b. Classes
    c. Data instances
    d. None of the above



  3. Non-leaf nodes of a decision tree correspond to:
    a. Attributes
    b. Classes
    c. Data instances
    d. None of the above



  4. Which of the following criteria is used to decide which attribute to split on next in a decision tree?
    a. Support
    b. Confidence
    c. Entropy
    d. Scatter




  5. If we convert a decision tree to a set of logical rules, then:
    a. The internal nodes in a branch are connected by AND and the branches by AND
    b. The internal nodes in a branch are connected by OR and the branches by OR
    c. The internal nodes in a branch are connected by AND and the branches by OR
    d. The internal nodes in a branch are connected by OR and the branches by AND

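To see how a tree reads as rules: each root-to-leaf path becomes one rule whose tests are joined by AND, and the resulting rules are combined by OR. A minimal Python sketch (the tree shape below is an illustrative example, not taken from the assignment):

```python
# A toy decision tree as nested tuples: (attribute, {value: subtree}),
# where a leaf is just a class string. The tree here is hypothetical.
tree = ("A1", {0: "normal",
               1: ("A2", {0: "normal", 1: "fraud"})})

def to_rules(node, path=()):
    """Walk every root-to-leaf path, joining the tests along it with AND."""
    if isinstance(node, str):                       # leaf -> one rule
        cond = " AND ".join(f"{a}={v}" for a, v in path)
        return [f"IF {cond} THEN {node}"]
    attr, branches = node
    rules = []
    for value, subtree in branches.items():
        rules.extend(to_rules(subtree, path + ((attr, value),)))
    return rules

for rule in to_rules(tree):                         # the rules combine by OR
    print(rule)
```

This prints one conjunctive rule per branch, e.g. `IF A1=1 AND A2=1 THEN fraud`; the whole rule set is the OR of those rules.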


  6. The purpose of pruning a decision tree is:
    a. Improving training set classification accuracy
    b. Improving generalization performance
    c. Dimensionality reduction
    d. Tree balancing



  7. Given the following training set for a classification problem with two classes, “fraud” and “normal”, splitting on which attribute at the root of a decision tree will lead to the highest information gain?

A1  A2  Class
1   0   fraud
1   1   fraud
1   1   fraud
1   0   normal
1   1   fraud
0   0   normal
0   0   normal
0   0   normal
1   1   normal
1   0   normal

a. A1
b. A2
c. There will be a tie among the attributes
d. Not enough information to decide

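As a quick numerical check, the information gain of each candidate split can be computed directly from the table above with a few lines of Python (logs are base 2):

```python
import math

# The ten (A1, A2, class) rows from the table above.
rows = [(1, 0, "fraud"), (1, 1, "fraud"), (1, 1, "fraud"), (1, 0, "normal"),
        (1, 1, "fraud"), (0, 0, "normal"), (0, 0, "normal"), (0, 0, "normal"),
        (1, 1, "normal"), (1, 0, "normal")]

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def info_gain(rows, attr):
    """Entropy reduction from splitting on column attr (0 = A1, 1 = A2)."""
    labels = [r[2] for r in rows]
    gain = entropy(labels)
    for v in {r[attr] for r in rows}:
        subset = [r[2] for r in rows if r[attr] == v]
        gain -= (len(subset) / len(rows)) * entropy(subset)
    return gain

print(round(info_gain(rows, 0), 3))  # gain for A1 ≈ 0.281
print(round(info_gain(rows, 1), 3))  # gain for A2 ≈ 0.256
```

Splitting on A1 sends all three A1 = 0 rows to a pure “normal” branch, which is why it yields the larger gain.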


  8. Given the following training set, the entropy is:

A1  A2  Class
1   0   fraud
1   1   fraud
1   1   fraud
1   0   normal
1   1   fraud
0   0   normal
0   0   normal
0   0   normal
1   1   normal
1   0   normal

a. 0
b. -(4/10) × log(4/10) - (6/10) × log(6/10)
c. -log(4/10) - log(6/10)
d. 1

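The class column above has 4 “fraud” and 6 “normal” rows, so the entropy can be verified with a short Python snippet (base-2 logs):

```python
import math

# Class labels from the ten training rows above: 4 "fraud", 6 "normal".
labels = ["fraud"] * 4 + ["normal"] * 6

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# -(4/10)log2(4/10) - (6/10)log2(6/10) ≈ 0.971 bits
print(round(entropy(labels), 3))
```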


  9. Given the following training set, splitting on attribute A1 in the root leads to an entropy reduction of:

A1  A2  Class
1   0   fraud
1   1   fraud
1   1   fraud
1   0   normal
1   1   fraud
0   0   normal
0   0   normal
0   0   normal
1   1   normal
1   0   normal

a. -log(4/10) - log(6/10)
b. -(4/10) × log(4/10) - (6/10) × log(6/10) + (4/7) × log(4/7) + (3/7) × log(3/7)
c. -(4/10) × log(4/10) - (6/10) × log(6/10) + (4/7) × log(4/7) + (3/7) × log(3/7) + 1
d. 1



  10. Given the following training set, splitting on attribute A2 in the root leads to an entropy reduction of:

A1  A2  Class
1   0   fraud
1   1   fraud
1   1   fraud
1   0   normal
1   1   fraud
0   0   normal
0   0   normal
0   0   normal
1   1   normal
1   0   normal

a. -log(4/10) - log(6/10)
b. -(4/10) × log(4/10) - (6/10) × log(6/10) + (1/4) × log(1/4) + (3/7) × log(3/7)
c. -(4/10) × log(4/10) - (6/10) × log(6/10) + (3/4) × log(3/4) + (1/4) × log(1/4) + (1/6) × log(1/6) + (5/6) × log(5/6)
d. 1



  11. Decision trees can be used for:
    a. Classification only
    b. Regression only
    c. Both classification and regression
    d. Neither classification nor regression

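On the classification-vs-regression point: a regression tree is grown the same way, but it chooses splits that reduce the variance (squared error) of a numeric target instead of class entropy. A small illustrative sketch with made-up (x, y) data:

```python
# For regression, a tree split is scored by how much it reduces the
# variance of a numeric target. The (x, y) pairs below are hypothetical.
data = [(1.0, 5.0), (2.0, 6.0), (3.0, 5.5),
        (7.0, 20.0), (8.0, 21.0), (9.0, 19.5)]

def variance(ys):
    """Mean squared deviation of the target values."""
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def best_split(data):
    """Threshold on x that most reduces the weighted target variance."""
    ys = [y for _, y in data]
    best_t, best_reduction = None, 0.0
    for t in sorted({x for x, _ in data})[:-1]:   # candidate thresholds
        left = [y for x, y in data if x <= t]
        right = [y for x, y in data if x > t]
        weighted = (len(left) * variance(left) +
                    len(right) * variance(right)) / len(data)
        reduction = variance(ys) - weighted
        if reduction > best_reduction:
            best_t, best_reduction = t, reduction
    return best_t, best_reduction

# Splits between the two y clusters, at x = 3.0.
print(best_split(data))
```

The split lands between the low-y and high-y clusters, exactly as an entropy-based split would separate two pure classes.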



For answers to other NPTEL courses, please refer to this link: NPTEL Assignment