Introduction to Machine Learning Nptel Week 9 Answers

Are you looking for Introduction to Machine Learning Nptel Week 9 Answers? You’ve come to the right place! Access the latest and most accurate solutions for your Week 9 assignment in the Introduction to Machine Learning course.



Introduction to Machine Learning Nptel Week 9 Answers (Jan-Apr 2025)

Course Link: Click Here


1. Consider the Markov Random Field given below. We need to delete one edge (without deleting any nodes) so that in the resulting graph, B and F are independent given A. Which of these edges could be deleted to achieve this independence?

a) AC
b) BE
c) CE
d) AE



2. Consider the Markov Random Field from question 1. We need to delete one node (and also delete the edges incident with that node) so that in the resulting graph, B and C are independent given A. Which of these nodes could be deleted to achieve this independence?

a) D
b) E
c) F
d) None of the above





3. Consider the Markov Random Field from question 1. Which of the nodes has / have the largest Markov blanket (i.e. the Markov blanket containing the most nodes)?

a) A
b) B
c) C
d) D
e) E
f) F



4. Consider the Bayesian Network given below. Which of the following independence relations hold?

a) A and B are independent if C is given
b) A and B are independent if no other variables are given
c) C and D are not independent if A is given
d) A and F are independent if C is given



5. In the Bayesian Network from question 4, assume that every variable is binary. What is the number of independent parameters required to represent all the probability tables for the distribution?

a) 8
b) 12
c) 16
d) 24
e) 36





6. In the Bayesian Network from question 4, suppose variables A, C, E can take four possible values, while variables B, D, F are binary. What is the number of independent parameters required to represent all the probability tables for the distribution?

a) 24
b) 36
c) 48
d) 64
e) 84



7. In the Bayesian Network from question 4, suppose all variables can take 4 values. What is the number of independent parameters required to represent all the probability tables for the distribution?

a) 72
b) 90
c) 108
d) 128
e) 144

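For questions 5–7, the counting rule is the same: each node contributes (number of its values − 1) independent parameters per combination of its parents' values. A small sketch of this rule, assuming the parent sets read off the factorization shown in question 8 (A and B are roots; C has parents A, B; D has parent A; E and F each have parent C), since the figure itself is not reproduced here:

```python
# Parameter counting for a Bayesian network, assuming the structure
# implied by the factorization in question 8:
#   P(A) P(B) P(C|A,B) P(D|A) P(E|C) P(F|C)

def num_independent_params(cardinality, parents):
    """Sum over nodes of (|X| - 1) * (product of parent cardinalities)."""
    total = 0
    for node, k in cardinality.items():
        parent_combos = 1
        for p in parents.get(node, []):
            parent_combos *= cardinality[p]
        total += (k - 1) * parent_combos
    return total

parents = {"C": ["A", "B"], "D": ["A"], "E": ["C"], "F": ["C"]}

# Q5: every variable binary
print(num_independent_params({v: 2 for v in "ABCDEF"}, parents))  # → 12

# Q6: A, C, E take four values; B, D, F are binary
card6 = {"A": 4, "B": 2, "C": 4, "D": 2, "E": 4, "F": 2}
print(num_independent_params(card6, parents))  # → 48

# Q7: all variables take four values
print(num_independent_params({v: 4 for v in "ABCDEF"}, parents))  # → 90
```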


8. Consider the Bayesian Network from question 4. Which of the given options are valid factorizations for calculating the marginal P(E = e) using variable elimination (the elimination order need not be optimal)?

a) ∑B P(B) ∑A P(A) ∑D P(D|A) ∑C P(C|A,B) ∑F P(E=e|C) P(F|C)
b) ∑A P(A) ∑D P(D|A) ∑B P(B) ∑C P(C|A,B) ∑F P(E=e|C) P(F|C)
c) ∑B P(B) ∑A P(D|A) ∑D P(A) ∑F P(C|A,B) ∑C P(E=e|C) P(F|C)
d) ∑A P(B) ∑B P(D|A) ∑D P(A) ∑F P(C|A,B) ∑C P(E=e|C) P(F|C)
e) ∑A P(A) ∑B P(B) ∑C P(C|A,B) ∑D P(D|A) ∑F P(E=e|C) P(F|C)

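A valid elimination order is just a rearrangement of the full joint sum, so every valid option must evaluate to the same P(E = e). This can be checked numerically: the sketch below uses randomly generated CPTs (the numbers are arbitrary, not from the course figure) and the structure read off the factorization in the options, and verifies that option (e)'s nesting agrees with brute-force enumeration of the joint.

```python
# Check that the nested sums of option (e) equal the brute-force
# marginal P(E = e) under the joint P(A) P(B) P(C|A,B) P(D|A) P(E|C) P(F|C).
# All CPT entries are random illustrative values.
from itertools import product
import random

random.seed(0)

def random_cpt(n_parents):
    """Random CPT for a binary variable with n_parents binary parents:
    maps each parent-value tuple to P(X = 1 | parents)."""
    return {pa: random.random() for pa in product((0, 1), repeat=n_parents)}

def p(cpt, x, pa=()):
    """P(X = x | parents = pa) under the given CPT."""
    q = cpt[pa]
    return q if x == 1 else 1.0 - q

# Structure assumed from the options: A, B roots; C has parents A, B;
# D has parent A; E and F each have parent C.
cA, cB = random_cpt(0), random_cpt(0)
cC, cD = random_cpt(2), random_cpt(1)
cE, cF = random_cpt(1), random_cpt(1)

e = 1  # the observed value E = e

# Brute force: P(E = e) as the full joint summed over A, B, C, D, F.
marginal = sum(
    p(cA, a) * p(cB, b) * p(cC, c, (a, b)) * p(cD, d, (a,))
    * p(cE, e, (c,)) * p(cF, f, (c,))
    for a, b, c, d, f in product((0, 1), repeat=5)
)

# Option (e): ∑A P(A) ∑B P(B) ∑C P(C|A,B) ∑D P(D|A) ∑F P(E=e|C) P(F|C)
option_e = sum(
    p(cA, a) * sum(
        p(cB, b) * sum(
            p(cC, c, (a, b)) * sum(
                p(cD, d, (a,)) * sum(
                    p(cE, e, (c,)) * p(cF, f, (c,)) for f in (0, 1)
                ) for d in (0, 1)
            ) for c in (0, 1)
        ) for b in (0, 1)
    ) for a in (0, 1)
)

print(abs(marginal - option_e) < 1e-12)  # → True
```

The invalid options fail structurally rather than numerically: in (c) and (d), factors such as P(C|A,B) appear outside the sum that binds their variables, so the expression is not even well-formed as an elimination of the joint.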




9. Consider the MRF given below. Which of the following factorization(s) of P(a, b, c, d, e) satisfies/satisfy the independence assumptions represented by this MRF?

a) P(a,b,c,d,e) = (1/Z) ψ1(a,b,c,d)ψ2(b,e)
b) P(a,b,c,d,e) = (1/Z) ψ1(b)ψ2(a,c,d)ψ3(a,b,e)
c) P(a,b,c,d,e) = (1/Z) ψ1(a,b)ψ2(c,d)ψ3(b,e)
d) P(a,b,c,d,e) = (1/Z) ψ1(a,b)ψ2(c,d)ψ3(b,d,e)
e) P(a,b,c,d,e) = (1/Z) ψ1(a,c)ψ2(b,d)ψ3(b,e)
f) P(a,b,c,d,e) = (1/Z) ψ1(c)ψ2(b,e)ψ3(b,a,d)



10. The following figure shows an HMM for three time steps i = 1, 2, 3. Suppose that it is used to perform part-of-speech tagging for a sentence. Which of the following statements is/are true?

a) The Xi variables represent parts-of-speech and the Yi variables represent the words in the sentence.
b) The Yi variables represent parts-of-speech and the Xi variables represent the words in the sentence.
c) The Xi variables are observed and the Yi variables need to be predicted.
d) The Yi variables are observed and the Xi variables need to be predicted.

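In HMM-based part-of-speech tagging, the tag sequence is the hidden state chain and the word sequence is observed, so decoding means recovering the most probable tag sequence for the observed words, which is what the Viterbi algorithm does. A minimal sketch, using invented toy tags, words, and probabilities (none of these numbers come from the course figure):

```python
# Minimal Viterbi decoding for a toy POS-tagging HMM: tags are hidden,
# words are observed.  All tags, words, and probabilities below are
# hypothetical illustrative values.
tags = ["NOUN", "VERB"]
start = {"NOUN": 0.6, "VERB": 0.4}                         # P(tag_1)
trans = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},               # P(tag_i | tag_{i-1})
         "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit = {"NOUN": {"dogs": 0.5, "bark": 0.1, "runs": 0.4},   # P(word | tag)
        "VERB": {"dogs": 0.1, "bark": 0.6, "runs": 0.3}}

def viterbi(words):
    """Return the most probable tag sequence for the observed words."""
    # Each layer maps tag -> (best prob of a prefix ending in tag, backpointer).
    delta = [{t: (start[t] * emit[t][words[0]], None) for t in tags}]
    for w in words[1:]:
        delta.append({
            t: max((delta[-1][s][0] * trans[s][t] * emit[t][w], s) for s in tags)
            for t in tags
        })
    # Backtrack from the best final tag.
    best = max(tags, key=lambda t: delta[-1][t][0])
    path = [best]
    for layer in reversed(delta[1:]):
        path.append(layer[path[-1]][1])
    return path[::-1]

print(viterbi(["dogs", "bark"]))  # → ['NOUN', 'VERB']
```

Note the direction of dependence: emission probabilities condition the observed word on the hidden tag, which is exactly why the hidden variables in the figure are the tags and the observed variables are the words.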



More Weeks of INTRODUCTION TO MACHINE LEARNING: Click here

More Nptel Courses: Click here