Introduction To Machine Learning IIT-KGP Nptel Week 7 Assignment Answers

Are you looking for NPTEL Introduction To Machine Learning IIT-KGP Week 7 Answers 2024? This guide offers comprehensive assignment solutions tailored to help you master the key machine learning concepts covered this week, such as ensemble methods, bagging, boosting (AdaBoost), and the VC dimension.

Course Link: Click Here


Introduction To Machine Learning IIT-KGP Week 7 Answers (July-Dec 2024)


Q1. Which of the following options is/are correct regarding the benefits of an ensemble model?

1. Better performance
2. More generalized model
3. Better interpretability

A) 1 and 3
B) 2 and 3
C) 1 and 2
D) 1, 2 and 3

Answer: C) 1 and 2
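
Why options 1 and 2 hold while 3 does not: combining many better-than-chance voters raises accuracy and smooths out individual models' quirks, but the aggregate vote is harder to explain than any single model. A minimal sketch of the voting arithmetic, assuming independent base classifiers that are each correct with probability p:

```python
from math import comb

def majority_accuracy(n, p):
    """P(majority of n independent classifiers is correct), each correct w.p. p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

print(majority_accuracy(1, 0.7))   # 0.7   -- a single base model
print(majority_accuracy(5, 0.7))   # ~0.84 -- ensemble of 5
print(majority_accuracy(25, 0.7))  # ~0.98 -- ensemble of 25
```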


Q2. In AdaBoost, we give more weight to points that were misclassified in previous iterations. Now, suppose we introduce a limit or cap on the weight that any point can take (for example, a restriction that prevents any point’s weight from exceeding a value of 10). Which among the following would be the effect of such a modification?

A) It will have no effect on the performance of the AdaBoost method.

B) It makes the final classifier robust to outliers.

C) It may result in lower overall performance.

D) None of these.

Answer: B, C
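
Outliers tend to be misclassified round after round, so without a cap their weights grow geometrically; bounding them limits their influence (option B) but also limits how hard later rounds can focus on genuinely difficult points, which can cost accuracy (option C). Below is a minimal sketch of where such a cap would sit in the reweighting step — an illustration in NumPy, not the course's reference code:

```python
import numpy as np

def reweight(w, y, pred, alpha, cap=None):
    """One AdaBoost-style weight update, with an optional per-point cap.

    y and pred are arrays of +/-1 labels; points with y * pred == -1 were
    misclassified and get up-weighted.
    """
    w = w * np.exp(-alpha * y * pred)
    w = w / w.sum()
    if cap is not None:        # the modification from the question:
        w = np.minimum(w, cap) # cap, then renormalise (one simple variant)
        w = w / w.sum()
    return w

w = np.full(5, 0.2)
y = np.array([1, 1, 1, -1, -1])
pred = np.array([1, 1, 1, -1, 1])               # last point misclassified
print(reweight(w, y, pred, alpha=0.8))           # its weight grows sharply
print(reweight(w, y, pred, alpha=0.8, cap=0.3))  # its share is reduced
```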




Q3. Identify whether the following statement is true or false:

“Boosting is easy to parallelize whereas bagging is inherently a sequential process.”
A) True
B) False

Answer: B) False
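
The statement reverses the two methods. In bagging, each base model is fit on its own independent bootstrap resample, so the fits can run concurrently; in boosting, round t's sample weights depend on round t−1's errors, forcing sequential execution. A runnable sketch of the parallel half, where the "model" is just the resample's majority class as a stand-in for any real base learner:

```python
from concurrent.futures import ProcessPoolExecutor
from functools import partial
import numpy as np

def fit_one(seed, X, y):
    """Fit one base model on its own bootstrap resample, independently of the rest."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), size=len(X))   # sample with replacement
    vals, counts = np.unique(y[idx], return_counts=True)
    return vals[np.argmax(counts)]               # trivial stand-in "model"

def bag(X, y, n_models=10):
    # No resample/fit pair depends on another, so all fits can run in parallel.
    # Boosting cannot be split this way: round t needs round t-1's errors.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(partial(fit_one, X=X, y=y), range(n_models)))

if __name__ == "__main__":
    X = np.arange(20).reshape(10, 2)
    y = np.array([0, 1, 1, 0, 1, 1, 0, 1, 0, 1])
    print(bag(X, y, n_models=4))
```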


Q4. Considering the AdaBoost algorithm, which among the following statements is true?

A) In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.

B) The weight assigned to an individual classifier depends upon the weighted sum error of misclassified points for that classifier.

C) Both options A and B are true

D) None of them are true

Answer: C) Both options A and B are true
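
Statement B refers to the standard stage-weight rule: a classifier's coefficient is computed from its weighted error ε on the current distribution as α = ½ ln((1 − ε) / ε). A quick sketch of that formula:

```python
import math

def stage_weight(eps):
    """AdaBoost coefficient for one stage, from its weighted error eps (0 < eps < 1)."""
    return 0.5 * math.log((1 - eps) / eps)

print(stage_weight(0.1))  # ~1.10: an accurate stage gets a large positive weight
print(stage_weight(0.4))  # ~0.20: a near-random stage contributes little
```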




Q5. Which of the following is FALSE about bagging?
A) Bagging increases the variance of the classifier
B) Bagging can help make robust classifiers from unstable classifiers.
C) Majority Voting is one way of combining outputs from various classifiers which are being bagged.


Answer: A) Bagging increases the variance of the classifier


Q6. Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?
A) At least one set of 6 points can be shattered by the hypothesis space.
B) Two sets of 6 points can be shattered by the hypothesis space.
C) All sets of 6 points can be shattered by the hypothesis space.
D) No set of 7 points can be shattered by the hypothesis space.

Answer: A, D
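
By definition, VC dimension d means *some* set of d points is shattered (all 2^d labelings are achievable) and *no* set of d+1 points is; it does not promise that every set of d points shatters, so A and D are the true statements. A brute-force illustration with a simpler class — intervals on the real line, whose VC dimension is 2 — where the endpoint grid is just a convenient assumption for the demo:

```python
from itertools import product

def can_realize(points, labels, grid):
    """True if some interval [a, b] with endpoints on `grid` produces `labels`."""
    return any(
        all((a <= x <= b) == bool(y) for x, y in zip(points, labels))
        for a in grid for b in grid if a <= b
    )

grid = [x / 2 for x in range(-2, 10)]  # candidate endpoints: -1.0, -0.5, ..., 4.5

# Intervals shatter the 2-point set {1, 2}: all 4 labelings are achievable.
print(all(can_realize([1, 2], lab, grid)
          for lab in product([0, 1], repeat=2)))      # True

# No 3-point set can be shattered: the labeling (+, -, +) would need an
# interval containing 1 and 3 but not 2, which is impossible.
print(can_realize([1, 2, 3], (1, 0, 1), grid))        # False
```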




Q7. Identify whether the following statement is true or false:

“Ensembles will yield bad results when there is a significant diversity among the models.”
A) True
B) False

Answer: B) False


Q8. Which of the following algorithms is not an ensemble learning algorithm?
A) Random Forest
B) AdaBoost
C) Decision Trees

Answer: C) Decision Trees




Q9. Suppose you have run AdaBoost on a training set for three boosting iterations. The results are classifiers h1, h2, and h3, with coefficients α1 = 0.2, α2 = −0.3, and α3 = −0.2. For a given test input x, you find that the classifiers’ results are h1(x) = 1, h2(x) = 1, and h3(x) = −1. What is the class returned by the AdaBoost ensemble classifier H on test example x?

A) 1

B) -1

Answer: A) 1
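
The ensemble prediction is H(x) = sign(Σ α_t · h_t(x)) = sign(0.2·1 + (−0.3)·1 + (−0.2)·(−1)) = sign(0.1) = +1. The same arithmetic in a few lines:

```python
alphas = [0.2, -0.3, -0.2]   # coefficients a1, a2, a3 from the question
preds  = [1, 1, -1]          # h1(x), h2(x), h3(x)

score = sum(a * h for a, h in zip(alphas, preds))  # 0.2 - 0.3 + 0.2 = 0.1
print(score, 1 if score > 0 else -1)               # ~0.1, class +1
```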




Q10. Generally, an ensemble method works better if the individual base models have __________? (Note: individual models have accuracy greater than 50%.)
A) Less correlation among predictions
B) High correlation among predictions
C) Correlation does not have an impact on the ensemble output
D) None of the above.

Answer: A) Less correlation among predictions
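
Intuition: a majority vote can only fix errors that the models do not share. A small simulation, where rho is an assumed knob that makes base models share a common source of error (each model is still 70% accurate on its own):

```python
import numpy as np

rng = np.random.default_rng(0)
B, n = 25, 100_000  # 25 base models, 100k test cases

def vote_accuracy(rho):
    common = rng.random(n)  # shared latent component driving correlated errors
    correct = np.stack([
        np.where(rng.random(n) < rho, common, rng.random(n)) < 0.7
        for _ in range(B)
    ])  # each model is correct with marginal probability 0.7
    return (correct.sum(axis=0) > B / 2).mean()

print(vote_accuracy(0.0))  # ~0.98: nearly independent errors, voting helps a lot
print(vote_accuracy(0.9))  # ~0.7:  highly correlated errors, voting adds little
```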




All weeks of Introduction to Machine Learning: Click Here


More Nptel Courses: https://progiez.com/nptel-assignment-answers


Introduction To Machine Learning IIT-KGP Week 7 Answers (July-Dec 2023)

Course Name: Introduction To Machine Learning IITKGP

Link to Enroll: Click Here


1) Which of the following options is/are correct regarding the benefits of an ensemble model?
1. Better performance
2. More generalized model
3. Better interpretability

A) 1 and 3
B) 2 and 3
C) 1 and 2
D) 1, 2 and 3

Answer: C) 1 and 2


2) In AdaBoost, we give more weight to points that were misclassified in previous iterations. Now, if we introduce a limit or cap on the weight that any point can take (for example, a restriction that prevents any point’s weight from exceeding a value of 10), which among the following would be an effect of such a modification?
A) We may observe the performance of the classifier decrease as the number of stages increases.
B) It makes the final classifier robust to outliers.
C) It may result in lower overall performance.
D) None of these.

Answer: B, C




3) Which among the following are some of the differences between bagging and boosting?
A) In bagging, we use the same classification algorithm for training on each sample of the data, whereas in boosting, we use different classification algorithms on the different training data samples.
B) Bagging is easy to parallelize whereas boosting is inherently a sequential process.
C) In bagging we typically use sampling with replacement whereas in boosting, we typically use weighted sampling techniques.
D) In comparison with the performance of a base classifier on a particular dataset, bagging will generally not increase the error, whereas boosting may lead to an increase in the error.

Answer: B, C, D


4) What is the VC dimension of the class of spheres in 3-dimensional space?
A) 3
B) 4
C) 5
D) 6

Answer: A




5) Considering the AdaBoost algorithm, which among the following statements is true?
A) In each stage, we try to train a classifier which makes accurate predictions on any subset of the data points where the subset size is at least half the size of the data set.
B) In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.
C) The weight assigned to an individual classifier depends upon the number of data points correctly classified by the classifier.
D) The weight assigned to an individual classifier depends upon the weighted sum error of misclassified points for that classifier.


Answer: B, D


6) Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?
A) At least one set of 6 points can be shattered by the hypothesis space.
B) Two sets of 6 points can be shattered by the hypothesis space.
C) All sets of 6 points can be shattered by the hypothesis space.
D) No set of 7 points can be shattered by the hypothesis space.

Answer: A, D




7) Ensembles will yield bad results when there is a significant diversity among the models. Write True or False.
A) True
B) False

Answer: B) False


8) Which of the following algorithms is not an ensemble learning algorithm?
A) Random Forest
B) AdaBoost
C) Gradient Boosting
D) Decision Trees

Answer: D) Decision Trees




9) Which of the following can be true for selecting base learners for an ensemble?
A) Different learners can come from the same algorithm with different hyperparameters
B) Different learners can come from different algorithms
C) Different learners can come from different training spaces
D) All of the above.

Answer: D) All of the above.
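
All three sources of diversity are used in practice. A short sketch with scikit-learn (assuming it is installed), just to make the options concrete:

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

base_learners = [
    DecisionTreeClassifier(max_depth=2),  # same algorithm,
    DecisionTreeClassifier(max_depth=8),  #   different hyperparameters (A)
    LogisticRegression(max_iter=1000),    # a different algorithm (B)
]
# Option (C): train each learner on a different resample or feature subset of
# the training data, as bagging and random forests do.
```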


10) Generally, an ensemble method works better if the individual base models have _________?
(Note: individual models have accuracy greater than 50%.)

A) Less correlation among predictions
B) High correlation among predictions
C) Correlation does not have an impact on the ensemble output
D) None of the above.

Answer: A) Less correlation among predictions