# INTRODUCTION TO MACHINE LEARNING Week 9

**Session: JULY-DEC 2023**

**Course Name: Introduction to Machine Learning**

**These are Introduction to Machine Learning Week 9 Assignment 9 Answers**

**Q1. Which of the following best describes the Markov property in a Hidden Markov Model (HMM)?**

The future state depends on the current state and the entire past sequence of states.

The future state depends only on the current state and is independent of the past states, given the current state.

The future state depends on the past states and the future states, given the current state.

The future state depends only on the past states and is independent of the current state.

**Answer: The future state depends only on the current state and is independent of the past states, given the current state.**
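The answer above can be made concrete with a small simulation. The sketch below (states and transition probabilities are hypothetical, not from the quiz) samples each next state from the current state alone, which is exactly the Markov property:

```python
import random

# A toy two-state Markov chain illustrating the Markov property: the
# next state is sampled from a distribution that depends ONLY on the
# current state, never on the earlier history. The states and
# transition probabilities are made-up illustrative numbers.
TRANSITIONS = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}

def next_state(current, rng):
    # The sampler sees only `current`; no other history is passed in.
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```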

**Q2. Statement 1: Probability distributions are valid potential functions. Statement 2: Probability is always strictly positive.**

Statement 1 is true. Statement 2 is true. Statement 2 is the correct reason for statement 1.

Statement 1 is true. Statement 2 is true. Statement 2 is not the correct reason for statement 1.

Statement 1 is true. Statement 2 is false.

Both statements are false.

**Answer: Statement 1 is true. Statement 2 is false.**

**Q3. In the undirected graph given below, which nodes are conditionally independent of each other given B? Select all that apply.**

C, D

D, E

E, C

A, F

None of the above

**Answer: C, D and E, C (the first and third options)**

**Q4. For the graph given below, the factorization is:**

p(x,y,z)=p(x)p(y|x)p(y|z)

p(x,y,z)=p(y)p(x|y)p(z|y)

p(x,y,z)=p(z)p(z|y)p(x|y)

p(x,y,z)=p(y)p(y|x)p(y|z)

**Answer: p(x,y,z)=p(y)p(x|y)p(z|y)**
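The selected factorization corresponds to the graph x ← y → z, where y is the parent of both x and z. As a sanity check, the product p(y)p(x|y)p(z|y) defines a valid joint; the sketch below uses hypothetical binary tables (not the ones from the question's figure) and verifies that the joint sums to 1:

```python
# Hypothetical binary probability tables for the factorization
# p(x, y, z) = p(y) p(x|y) p(z|y). These numbers are illustrative
# placeholders, not taken from the question's figure.
p_y = {0: 0.4, 1: 0.6}
p_x_given_y = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}  # key: (x, y)
p_z_given_y = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.3, (1, 1): 0.7}  # key: (z, y)

def joint(x, y, z):
    # Chain-rule product for the chosen factorization.
    return p_y[y] * p_x_given_y[(x, y)] * p_z_given_y[(z, y)]

# A valid joint distribution must sum to 1 over all assignments.
total = sum(joint(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1))
```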

**Q5. For the given graphical model, what is the optimal variable elimination order when trying to calculate P(E=e)?**

A, B, C, D

D, C, B, A

A, D, B, C

D, A, C, A

**Answer: A, B, C, D**

**Q6. Which of the following methods are used for calculating conditional probabilities? (more than one may apply)**

Viterbi algorithm

MAP inference

Variable elimination

Belief propagation

**Answer: Variable elimination, Belief propagation**
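Variable elimination computes a marginal by summing variables out one at a time and caching the intermediate factors. A minimal sketch on a hypothetical chain A → B → C (binary variables, made-up CPDs), eliminating A and then B to obtain p(C):

```python
# Hypothetical binary CPDs for the chain A -> B -> C; the numbers are
# illustrative placeholders, not from any quiz figure.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key: (c, b)

# Eliminate A: tau1(b) = sum_a p(a) p(b|a)
tau1 = {b: sum(p_a[a] * p_b_given_a[(b, a)] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: p(c) = sum_b tau1(b) p(c|b)
p_c = {c: sum(tau1[b] * p_c_given_b[(c, b)] for b in (0, 1)) for c in (0, 1)}
```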

**Q7. In the undirected graph given below, which nodes are conditionally independent of each other given a single other node (may be different for different pairs)? Select all that apply.**

3, 2

0, 4

2, 5

1, 5

**Answer: 3, 2 and 1, 5 (the first and fourth options)**

**Session: JAN-APR 2023**

**Course Name: Introduction to Machine Learning**

**Q1. Consider the Bayesian network shown below.**

**Two students, Manish and Trisha, make the following claims:**

- **Manish claims P(D|{S,L,C}) = P(D|{L,C})**
- **Trisha claims P(D|{S,L}) = P(D|L)**

**Here P(X|Y) denotes the probability of event X given Y; note that Y can be a set. Which of the following is true?**

a. Manish and Trisha are correct.

b. Manish is correct and Trisha is incorrect.

c. Manish is incorrect and Trisha is correct.

d. Both are incorrect.

e. Insufficient information to make any conclusion. Probability distributions of each variable should be given.

**Answer: b. Manish is correct and Trisha is incorrect.**

**Q2. Consider the same Bayesian network shown in the previous question (Figure 1). Two other students in the class, Trina and Manish, make the following claims:**

- **Trina claims P(S|{G,C}) = P(S|C)**
- **Manish claims P(L|{D,G}) = P(L|G)**

**Which of the following is true?**

a. Both the students are correct.

b. Trina is incorrect and Manish is correct.

c. Trina is correct and Manish is incorrect.

d. Both the students are incorrect.

e. Insufficient information to make any conclusion. Probability distributions of each variable should be given.

**Answer: a. Both the students are correct.**

**Q3. Consider the Bayesian graph shown below in Figure 2.**

**The random variables have the following notation: d – Difficulty, i – Intelligence, g – Grade, s -SAT, l – Letter. The random variables are modeled as discrete variables and the corresponding CPDs are as below.**

**What is the value of P(i=1, d=0, g=2, s=1, l=1)?**

a. 0.004608

b. 0.006144

c. 0.001536

d. 0.003992

e. 0.009216

f. 0.007309

g. None of these

**Answer: e. 0.009216**

**Q4. Using the data given in the previous question, compute the probability of the following assignment, P(i=1, g=1, s=1, l=0), irrespective of the difficulty of the course (up to 3 decimal places).**

a. 0.160

b. 0.371

c. 0.662

d. 0.047

e. 0.037

f. 0.066

g. 0.189

**Answer: d. 0.047**
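The marginalization asked for here sums the chain-rule product P(d)P(i)P(g|i,d)P(s|i)P(l|g) over the difficulty d. The sketch below uses hypothetical placeholder CPD entries (not the ones from Figure 2), so the number it produces illustrates the method rather than reproducing the listed answer:

```python
# Hypothetical CPD entries for the student network; only the values
# needed for this particular query are listed. These numbers are
# placeholders, NOT the CPDs from Figure 2.
P_d = {0: 0.6, 1: 0.4}              # P(d)
P_i1 = 0.3                          # P(i = 1)
P_g1 = {(1, 0): 0.9, (1, 1): 0.5}   # P(g = 1 | i = 1, d), keyed by (g, d)
P_s1_i1 = 0.8                       # P(s = 1 | i = 1)
P_l0_g1 = 0.1                       # P(l = 0 | g = 1)

# Marginalize over d:
#   P(i=1, g=1, s=1, l=0) = sum_d P(d) P(i=1) P(g=1|i=1,d) P(s=1|i=1) P(l=0|g=1)
prob = sum(P_d[d] * P_i1 * P_g1[(1, d)] * P_s1_i1 * P_l0_g1 for d in (0, 1))
```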

**Q5. Consider the Bayesian network shown below in Figure 3**

**Two students, Manish and Trisha, make the following claims:**

- **Manish claims P(H|{S,G,J}) = P(H|{G,J})**
- **Trisha claims P(H|{S,C,J}) = P(H|{C,J})**

**Which of the following is true?**

a. Manish and Trisha are correct.

b. Both are incorrect.

c. Manish is incorrect and Trisha is correct.

d. Manish is correct and Trisha is incorrect.

e. Insufficient information to make any conclusion. Probability distributions of each variable should be given.

**Answer: d. Manish is correct and Trisha is incorrect.**

**Q6. Consider the Markov network shown below in Figure 4**

**Which of the following variables are NOT in the Markov blanket of variable “4” shown in Figure 4? (multiple answers may be correct)**

a. 1

b. 8

c. 2

d. 5

e. 6

f. 4

g. 7

**Answer: d, f, g**

**Q7. In the Markov network given in Figure 4, two students make the following claims:**

- **Manish claims variable “1” is dependent on variable “7” given variable “2”.**
- **Trina claims variable “2” is independent of variable “6” given variable “3”.**

**Which of the following is true?**

a. Both the students are correct.

b. Trina is incorrect and Manish is correct.

c. Trina is correct and Manish is incorrect.

d. Both the students are incorrect.

e. Insufficient information to make any conclusion. Probability distributions of each variable should be given.

**Answer: d. Both the students are incorrect.**

**Q8. Four random variables are known to follow the factorization**

**P(A₁=a₁, A₂=a₂, A₃=a₃, A₄=a₄) = (1/Z) ψ₁(a₁,a₂) ψ₂(a₁,a₄) ψ₃(a₁,a₃) ψ₄(a₂,a₄) ψ₅(a₃,a₄)**

**The corresponding Markov network would be**

**Answer: c.**

**Q9. Does there exist a more compact factorization, involving fewer factors, for the distribution given in the previous question?**

a. Yes

b. No

c. Insufficient information

**Answer: a. Yes**
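The pairwise factorization in Q8 can be evaluated numerically. The sketch below assumes binary variables and a single made-up potential ψ that rewards agreement between its arguments; the partition function Z is obtained by brute-force summation over all 2⁴ assignments:

```python
import itertools

# Evaluate P(a1,a2,a3,a4) = (1/Z) psi1(a1,a2) psi2(a1,a4) psi3(a1,a3)
#                                 psi4(a2,a4) psi5(a3,a4)
# with binary variables. One shared hypothetical potential stands in
# for all five factors; real problems would use separate tables.
def psi(x, y):
    # Made-up potential: prefers its two arguments to agree.
    return 2.0 if x == y else 1.0

def unnorm(a1, a2, a3, a4):
    # Unnormalized product over the five pairwise factors.
    return psi(a1, a2) * psi(a1, a4) * psi(a1, a3) * psi(a2, a4) * psi(a3, a4)

# Partition function: sum of the unnormalized product over all assignments.
Z = sum(unnorm(*assign) for assign in itertools.product((0, 1), repeat=4))

def prob(a1, a2, a3, a4):
    return unnorm(a1, a2, a3, a4) / Z
```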

**Q10. Consider the following Markov Random Field.**

**Which of the following nodes will have no effect on H given the Markov Blanket of H?**

A

B

C

D

E

F

G

H

I

J

**Answer: C, E, F**

**Session: JUL-DEC 2022**

Course Name: INTRODUCTION TO MACHINE LEARNING

**Q1. In the undirected graph given below, which nodes are conditionally independent of each other given B? Select all that apply.**

a. A, D

b. D, E

c. C, D

d. A, F

e. None of the above

**Answer: a. A, D**

**Q2. In the modified undirected graph given below, which nodes are conditionally independent of each other given B? Select all that apply.**

a. C, D

b. D, E

c. E, C

d. A, F

e. None of the above

**Answer: c. E, C**

**These are Introduction to Machine Learning Week 9 Assignment 9 Answers**

**Q3. In the undirected graph given below, how many terms does its potential function factorization contain?**

a. 7

b. 3

c. 5

d. 9

e. None of the above

**Answer: b. 3**

**Q4. Which of these can be modeled as an HMM? Select all that apply.**

a. Machine translation

b. Speech recognition

c. Trajectory of a baseball

d. Fibonacci sequence

**Answer: a. Machine translation; b. Speech recognition** (both have classic HMM formulations, whereas the Fibonacci sequence and a baseball trajectory are deterministic processes with no hidden stochastic state)

**Q5. HMMs can be used to find which of the following? Select all that apply.**

a. Probability of a given observation sequence

b. All possible hidden state sequences given an observation sequence

c. Most probable observation sequence given the hidden states

d. Most probable hidden states given the observation sequence

**Answer: a. Probability of a given observation sequence; d. Most probable hidden states given the observation sequence** (the standard HMM inference tasks: evaluation via the forward algorithm and decoding via the Viterbi algorithm)
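One standard HMM inference task, computing the probability of a given observation sequence, is handled by the forward algorithm. A minimal sketch with hypothetical parameters (two hidden states, three observation symbols, all probabilities made up):

```python
# Hypothetical HMM parameters: states, initial distribution,
# transition matrix, and emission probabilities over symbols "1"-"3".
states = ("H", "C")
init = {"H": 0.6, "C": 0.4}
trans = {"H": {"H": 0.7, "C": 0.3}, "C": {"H": 0.4, "C": 0.6}}
emit = {"H": {"1": 0.2, "2": 0.4, "3": 0.4},
        "C": {"1": 0.5, "2": 0.4, "3": 0.1}}

def forward(obs):
    """Probability of the observation sequence `obs` under the HMM."""
    # Initialize with the first observation.
    alpha = {s: init[s] * emit[s][obs[0]] for s in states}
    # Recurse: alpha_t(s) = emit(s, o_t) * sum_p alpha_{t-1}(p) * trans(p, s)
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states}
    # Total probability: sum over final hidden states.
    return sum(alpha.values())
```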

**Q6. For the given graphical model, what is the optimal variable elimination order when trying to calculate P(E=e)?**

a. A, B, C, D

b. D, C, B, A

c. A, D, B, C

d. D, A, C, A

**Answer: d. D, A, C, A**

**Q7. What is the tree width for the GM in previous question?**

a. 5

b. 4

c. 3

d. 2

**Answer: c. 3**

**Q8. Belief propagation is used for**

a. Calculating map estimate

b. Calculating joint probability

c. Calculating conditional marginal

d. None of the above

**Answer: c. Calculating conditional marginal** (belief propagation is a message-passing algorithm for computing marginals, consistent with its listing under conditional-probability methods in Q6 of the JULY-DEC 2023 session above)

The content uploaded on this website is for reference purposes only. Please do it yourself first.