# An Introduction to Artificial Intelligence Week 9

**Course Name: An Introduction to Artificial Intelligence**

**Course Link: Click Here**

**These are An Introduction to Artificial Intelligence Answers Week 9**

**Q1. Consider the following Bayesian network with binary variables.**

**Calculate the probability P(a | d, e) using rejection sampling, given the following samples. Return the answer as a decimal rounded to 2 decimal places (for example, if it is 0.333, return 0.33).**

**Answer: 0.75**
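Rejection sampling estimates a conditional probability by discarding every sample that contradicts the evidence and taking the relative frequency of the query among the survivors. The quiz's network and samples are not reproduced here, so the sample list below is hypothetical, chosen only so the estimate comes out to 0.75:

```python
# Rejection sampling sketch: filter prior samples on the evidence, then take
# the relative frequency of the query variable among the accepted samples.

def rejection_estimate(samples, query, evidence):
    """Estimate P(query | evidence) from prior samples by rejection."""
    accepted = [s for s in samples
                if all(s[var] == val for var, val in evidence.items())]
    if not accepted:
        return None  # every sample was rejected; no estimate possible
    return sum(s[query] for s in accepted) / len(accepted)

# Hypothetical samples (NOT the quiz's): 4 match the evidence {d, e},
# and 3 of those 4 have a = True, giving 3/4 = 0.75.
samples = [
    {"a": True,  "d": True,  "e": True},
    {"a": True,  "d": True,  "e": True},
    {"a": True,  "d": True,  "e": True},
    {"a": False, "d": True,  "e": True},
    {"a": True,  "d": False, "e": True},   # rejected: d contradicts evidence
    {"a": False, "d": True,  "e": False},  # rejected: e contradicts evidence
]

print(rejection_estimate(samples, "a", {"d": True, "e": True}))  # 0.75
```

Note that the rejected samples are exactly why rejection sampling is wasteful when the evidence is unlikely, which motivates likelihood weighting in Q4.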

**Q2. Which of the following statements are true?**

a. Rejection sampling samples from the prior distribution

b. Rejection sampling samples from the posterior distribution

c. Likelihood sampling samples from the prior distribution

d. Likelihood sampling samples from the posterior distribution

**Answer: a. Rejection sampling samples from the prior distribution**

**Q3. Which of the following properties are valid for the environment of the Turing Test?**

a. Fully observable

b. Multi-Agent

c. Dynamic

d. Stochastic

**Answer: b, c, d**

**Q4. Consider the following Bayesian Network. Suppose you are doing likelihood weighting to determine P(s|¬w,c).**

**What is the weight of the sample (c, s, r, ¬w)? Return the answer as a decimal rounded to 3 decimal places (for example, if it is 0.1234, return 0.123).**

**Answer: 0.005**
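In likelihood weighting, evidence variables are never sampled; each one instead multiplies its conditional probability (given its parents in the sample) into the sample's weight. Assuming the question uses the classic cloudy/sprinkler/rain/wet-grass network with the standard CPT values P(c) = 0.5 and P(w | s, r) = 0.99 (an assumption, since the figure is not reproduced here), the weight of (c, s, r, ¬w) works out to the stated answer:

```python
# Likelihood-weighting sketch for the classic cloudy/sprinkler/rain/wet-grass
# network. Standard CPT values are ASSUMED: P(c) = 0.5, P(w | s, r) = 0.99.

P_c = 0.5            # P(Cloudy = true), which has no parents
P_w_given_sr = 0.99  # P(WetGrass = true | Sprinkler = true, Rain = true)

# The evidence in P(s | ¬w, c) is {C = true, W = false}; the sample fixes
# S = true and R = true.  Only evidence variables contribute to the weight:
#   weight = P(c) * P(¬w | s, r) = 0.5 * 0.01
weight = P_c * (1 - P_w_given_sr)
print(round(weight, 3))  # 0.005
```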

**Q5. Suppose we use MCMC with Gibbs sampling to determine P(s|w) in the above problem. Which of the following are correct statements in this case?**

a. We might need to calculate P(w|¬s,c,r) during the sampling process.

b. The relative frequency of reaching the states with S assigned true after sufficiently many steps will provide an estimate of P(s|w)

c. We can get a reliable estimate of probability by using the first few samples only.

d. Sampling using MCMC is asymptotically equivalent to sampling from the prior probability distribution.

**Answer: b. The relative frequency of reaching the states with S assigned true after sufficiently many steps will provide an estimate of P(s|w)**
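A minimal Gibbs sampler illustrates answer (b): after enough sweeps, the fraction of states with S = true estimates P(s | w). It also shows why (c) is wrong, since early (burn-in) samples are discarded. The sketch again assumes the standard cloudy/sprinkler/rain/wet-grass CPTs, under which the exact value of P(s | w) is about 0.43:

```python
# Gibbs-sampling sketch for P(s | w) on the cloudy/sprinkler/rain/wet-grass
# network.  Standard AIMA CPT values are ASSUMED throughout.
import random

random.seed(0)

def joint(c, s, r, w):
    """Full joint probability for the assumed CPT values."""
    p = 0.5                                                  # P(C)
    p *= (0.1 if c else 0.5) if s else (0.9 if c else 0.5)   # P(S | C)
    p *= (0.8 if c else 0.2) if r else (0.2 if c else 0.8)   # P(R | C)
    pw = {(True, True): 0.99, (True, False): 0.90,
          (False, True): 0.90, (False, False): 0.00}[(s, r)]
    return p * (pw if w else 1 - pw)                         # P(W | S, R)

def gibbs_estimate_s_given_w(steps=20000, burn_in=1000):
    c, s, r = True, True, True   # arbitrary initial state; W = true is evidence
    hits = total = 0
    for t in range(steps):
        # Resample each hidden variable from its full conditional, which only
        # requires ratios of the joint with that variable flipped.
        for var in ("c", "s", "r"):
            state = {"c": c, "s": s, "r": r}
            pt = joint(**{**state, var: True}, w=True)
            pf = joint(**{**state, var: False}, w=True)
            state[var] = random.random() < pt / (pt + pf)
            c, s, r = state["c"], state["s"], state["r"]
        if t >= burn_in:         # discard burn-in: early samples are unreliable
            total += 1
            hits += s
    return hits / total

# Stochastic estimate of P(s | w); exact value for these CPTs is ~0.43.
print(round(gibbs_estimate_s_given_w(), 2))
```

Each full conditional is computed from the joint restricted to the variable's Markov blanket, which is why a quantity like P(w | ¬s, c, r) in option (a) is never needed: W is evidence and is never resampled.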

**Q6. Consider the following Bayesian Network. What is the Markov Blanket of C? Return the answer as a lexicographically sorted string (for example, if the blanket consists of the nodes A, D and C return ACD)**

**Answer: ABDEFG**
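The Markov blanket of a node is its parents, its children, and its children's other parents. The quiz's network is not reproduced here, so the edge list below is a hypothetical DAG constructed to yield the same blanket for C:

```python
# Markov blanket = parents ∪ children ∪ co-parents (children's other parents).

def markov_blanket(edges, node):
    parents = {u for u, v in edges if v == node}
    children = {v for u, v in edges if u == node}
    co_parents = {u for u, v in edges if v in children and u != node}
    return parents | children | co_parents

# Hypothetical edges (NOT the quiz's figure), chosen so MB(C) = {A,B,D,E,F,G}.
edges = [("A", "C"), ("B", "C"),   # A, B are parents of C
         ("C", "D"), ("C", "E"),   # D, E are children of C
         ("F", "D"), ("G", "E")]   # F, G are other parents of C's children

print("".join(sorted(markov_blanket(edges, "C"))))  # ABDEFG
```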

**Q7. Which of the following provides a plausible way to learn the structure of Bayesian networks from data?**

a. Bayesian learning

b. Local search in the space of possible structures

c. MAP

d. MLE

**Answer: b. Local search in the space of possible structures**

**Q8. Consider the following Bayesian Network, where each variable is binary.**

**We have the following training examples for the above Bayesian net, where two examples contain unobserved values (denoted by ?). All of the parameters of the Bayesian network are set at 0.5 initially, except for P(b) and P(c|¬a,¬b), which are initialised to 0.8. What is the value of P(c|a,b) after simulating the second M step of the simple (hard) EM algorithm? If the answer is the fraction m/n where m and n have no common factors, return m+n. (e.g., 3 if the answer is 2/4)**

**Answer: 2**

**Q9. Ram is given a possibly biased coin and is asked to estimate the probability of it turning up heads. Ram tosses the coin 5 times and gets heads 3 times. Suppose that Ram uses maximum likelihood estimation as the learning algorithm for this task. What, according to him, is the probability of getting heads for the coin? Give the answer rounded off to 1 decimal place.**

**Answer: 0.6**
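For a Bernoulli parameter, the maximum likelihood estimate is simply the sample proportion, so 3 heads out of 5 tosses gives 0.6:

```python
# MLE for a Bernoulli parameter: the likelihood theta^h * (1-theta)^(n-h)
# is maximised at theta = h / n, the observed proportion of heads.
heads, tosses = 3, 5
theta_mle = heads / tosses
print(round(theta_mle, 1))  # 0.6
```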

**Q10. Suppose that Ram has a prior that the probability of the coin turning up heads is one of 0.4 (case 1), 0.5 (case 2), or 0.6 (case 3), with probability 1/3 each. Ram tosses the coin once and gets heads. What is the posterior probability of case 1 given this observation? Give the answer rounded off to 2 decimal places.**

**Answer: 0.27**
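By Bayes' rule, the posterior of each hypothesis is proportional to prior × likelihood, and the likelihood of one head under hypothesis θ is just θ. With a uniform prior, the 1/3 factors cancel, leaving 0.4 / (0.4 + 0.5 + 0.6) ≈ 0.27:

```python
# Bayesian update over three point hypotheses: posterior ∝ prior * likelihood.
priors = {0.4: 1/3, 0.5: 1/3, 0.6: 1/3}

# Likelihood of observing a single head under hypothesis theta is theta itself.
unnorm = {theta: p * theta for theta, p in priors.items()}
z = sum(unnorm.values())                 # normalising constant = 1.5 / 3
posterior_case1 = unnorm[0.4] / z        # 0.4 / 1.5 ≈ 0.2667
print(round(posterior_case1, 2))         # 0.27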

More Solutions of An Introduction to Artificial Intelligence: Click Here

More NPTEL Solutions: https://progiez.com/nptel/