Deep Learning Week 2 Assignment Answers NPTEL

Course Name: Deep Learning

Course Link: Click Here


NPTEL Deep Learning Week 2 Assignment 2 Answers (Jan-Apr 2025)


1. Suppose you are solving an n-class problem. How many discriminant functions will you need?

a. n − 1
b. n
c. n + 1
d. n²



2. If we choose the discriminant function g_i(x) as a function of the posterior probability, i.e. g_i(x) = f(p(w_i | x)), then which of the following cannot be the function f(·)?
a. f(x) = a^x, where a > 1
b. f(x) = a^(−x), where a > 1
c. f(x) = 2x + 3
d. f(x) = exp(x)



3. What will be the nature of the decision surface when the covariance matrices of the different classes are identical but otherwise arbitrary? (Assume all classes have equal class probabilities.)
a. Always orthogonal to two surfaces
b. Generally not orthogonal to two surfaces
c. Bisector of the line joining the two means, but not always orthogonal to two surfaces
d. Arbitrary



4. The mean and variance of all the samples of two normally distributed classes w_1 and w_2 are given. What will be the expression for the decision boundary between these two classes if both classes have equal class probability 0.5? For the input sample x, consider g_t(x) = x.
a. —
b. —
c. —
d. —



5. For a two-class problem, the linear discriminant function is given by g(x) = a^T y. What is the update rule for finding the weight vector a? Here, y is an augmented feature vector.
a. Adding the sum of all misclassified augmented feature vectors, multiplied by the learning rate, to the current weight vector
b. Subtracting the sum of all misclassified augmented feature vectors, multiplied by the learning rate, from the current weight vector
c. Adding the sum of all augmented feature vectors belonging to the positive class, multiplied by the learning rate, to the current weight vector
d. Subtracting the sum of all augmented feature vectors belonging to the negative class, multiplied by the learning rate, from the current weight vector

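For context, the classic batch perceptron rule sums the misclassified augmented vectors, scales them by the learning rate, and adds the result to the weight vector. A minimal Python sketch of that rule (function and variable names are illustrative, not from the course):

```python
import numpy as np

def batch_perceptron(Y, labels, eta=1.0, max_iter=100):
    """Batch perceptron update for a two-class problem.

    Y: (N, d) array of augmented feature vectors; labels in {+1, -1}.
    Negative-class vectors are sign-normalized so that every correctly
    classified sample satisfies a^T y > 0.
    """
    Y = Y * labels[:, None]               # sign-normalize the negative class
    a = np.zeros(Y.shape[1])              # initial weight vector
    for _ in range(max_iter):
        misclassified = Y[Y @ a <= 0]     # samples with a^T y <= 0
        if len(misclassified) == 0:       # all samples correct: converged
            break
        a += eta * misclassified.sum(axis=0)  # add the sum of the errors
    return a
```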


6. For a minimum distance classifier, which of the following must be satisfied?
a. All the classes should have an identical, diagonal covariance matrix
b. All the classes should have an identical but otherwise arbitrary covariance matrix
c. All the classes should have equal class probability
d. None of the above



7. Which of the following is the update rule of the gradient descent algorithm? Here, ∇ is the gradient operator and η is the learning rate.
a. a_{n+1} = a_n − η∇F(a_n)
b. a_{n+1} = a_n + η∇F(a_n)
c. a_{n+1} = a_n − η∇F(a_{n−1})
d. a_{n+1} = a_n + η a_{n−1}

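For context, gradient descent takes a step against the gradient at each iteration. A minimal sketch of that update (names are illustrative):

```python
import numpy as np

def gradient_descent(grad_F, a0, eta=0.1, steps=100):
    """Iterate the standard rule a_{n+1} = a_n - eta * grad_F(a_n)."""
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a = a - eta * grad_F(a)   # move against the gradient
    return a

# Example: minimize F(a) = ||a||^2, whose gradient is 2a.
print(gradient_descent(lambda a: 2 * a, [3.0, -4.0]))  # converges toward [0, 0]
```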


8. The decision surface between two normally distributed classes w_1 and w_2 is shown in the figure. Which of the following is true?
a. —
b. —
c. —
d. None of the above



9. In the k-nearest neighbors (k-NN) algorithm, how do we classify an unknown object?
a. Assigning the label that is most frequent among the k nearest training samples
b. Assigning the unknown object to the class of its nearest neighbor among the training samples
c. Assigning the label that is most frequent among all training samples except the k farthest neighbors
d. None of the above

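For context, the k-NN rule takes a majority vote among the k closest training samples. A minimal sketch using Euclidean distance (names are illustrative):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Assign x the label that is most frequent among its k nearest
    training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]             # majority label
```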


10. What is the direction of the weight vector w with respect to the decision surface for a linear classifier?
a. Parallel
b. Normal
c. At an inclination of 45°
d. Arbitrary



NPTEL Deep Learning Week 2 Assignment 2 Answers 2023


Q1. Suppose you are solving a four-class problem. How many discriminant functions will you need?
a. 1
b. 2
c. 3
d. 4

Answer: d. 4


Q2. Two random variables X1 and X2 follow Gaussian distributions with the following means and variances:
X1 ~ N(0, 3) and X2 ~ N(0, 2).
Which of the following is true?

a. The distribution of X1 will be flatter than the distribution of X2.
b. The distribution of X2 will be flatter than the distribution of X1.
c. The peaks of both distributions will be the same.
d. None of the above.

Answer: a. The distribution of X1 will be flatter than the distribution of X2.
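As a quick check, the peak of a zero-mean normal density is 1/√(2πσ²), so the larger variance of X1 gives a lower peak and a flatter curve:

```python
import math

def gaussian_peak(var):
    """Peak height of a normal density: 1 / sqrt(2 * pi * var)."""
    return 1.0 / math.sqrt(2 * math.pi * var)

print(gaussian_peak(3))  # X1: ~0.230 (larger variance -> lower peak, flatter)
print(gaussian_peak(2))  # X2: ~0.282
```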




Q3. Which of the following is true with respect to the discriminant function for normal density?
a. Decision surface is always orthogonal bisector to two surfaces when the covariance matrices of different classes are identical but otherwise arbitrary
b. Decision surface is generally not orthogonal to two surfaces when the covariance matrices of different classes are identical but otherwise arbitrary
c. Decision surface is always orthogonal to two surfaces but not bisector when the covariance matrices of different classes are identical but otherwise arbitrary
d. Decision surface is arbitrary when the covariance matrices of different classes are identical but otherwise arbitrary


Answer: b. Decision surface is generally not orthogonal to two surfaces when the covariance matrices of different classes are identical but otherwise arbitrary


Q4. In which of the following cases does the decision surface intersect the line joining the two class means at its midpoint? (Consider the class variance to be large relative to the difference of the two means.)
a. When both covariance matrices are identical and diagonal.
b. When the covariance matrices of both classes are identical but otherwise arbitrary.
c. When both covariance matrices are identical and diagonal, and both classes have equal class probability.
d. When the covariance matrices of the two classes are arbitrary and different.

Answer: c. When both covariance matrices are identical and diagonal, and both classes have equal class probability.




Q5. The decision surface between two normally distributed classes w₁ and w₂ is shown in the figure. Which of the following is true?

[Figure: decision surface between the two classes]

a. Σᵢ = σ²I, where Σᵢ is the covariance matrix of class i
b. Σᵢ = Σ, where Σᵢ is the covariance matrix of class i
c. Σᵢ = arbitrary, where Σᵢ is the covariance matrix of class i
d. None of the above.

Answer: c. Σᵢ = arbitrary, where Σᵢ is the covariance matrix of class i


Q6. For a minimum distance classifier, which of the following must be satisfied?
a. All the classes should have an identical, diagonal covariance matrix.
b. All the classes should have an identical but otherwise arbitrary covariance matrix.
c. All the classes should have equal class probability.
d. None of the above.

Answer: c. All the classes should have equal class probability.




Q7. Your spam-detection software achieves an accuracy of 99%, i.e., it detects 99% of spam emails, and its false-positive probability (a non-spam email detected as spam) is 5%. It is known that 50% of all emails are spam. If an email is detected as spam, what is the probability that it is in fact not spam?
a. 5/104
b. 5/100
c. 4.9/100
d. 0.25/100


Answer: a. 5/104
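The answer follows from Bayes' theorem: P(non-spam | flagged) = P(flagged | non-spam) · P(non-spam) / P(flagged). A quick check of the arithmetic:

```python
# Bayes' theorem check for Q7
p_spam = 0.5                 # prior: 50% of mails are spam
p_flag_given_spam = 0.99     # detection rate
p_flag_given_ham = 0.05      # false-positive rate

p_flag = p_flag_given_spam * p_spam + p_flag_given_ham * (1 - p_spam)  # 0.52
p_ham_given_flag = p_flag_given_ham * (1 - p_spam) / p_flag
print(p_ham_given_flag)      # 0.0480769... = 5/104
```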




Q8. Which of the following statements are true with respect to the k-NN classifier?
1. For a very large value of k, we may include points from other classes in the neighbourhood.
2. For a very small value of k, the algorithm is very sensitive to noise.
3. A k-NN classifier classifies an unknown sample by assigning it the label that is most frequent among its k nearest training samples.

a. Statement 1 only
b. Statement 1 and 2 only
c. Statement 1, 2, and 3
d. Statement 1 and 3 only

Answer: c. Statement 1, 2, and 3




Q9. Given the following two statements, which option is/are true in the case of k-NN?
1. For a very large value of k, we may include points from other classes in the neighbourhood.
2. For a very small value of k, the algorithm is very sensitive to noise.

a. 1
b. 2
c. 1 and 2
d. None of these.

Answer: c. 1 and 2




Q10. The decision boundary of a linear classifier is given by the following equation:
4x₁ + 6x₂ − 11 = 0
What will be the class of the following two unknown input examples? (Consider class 1 the positive class and class 2 the negative class.)
a1 = [1, 2]
a2 = [1, 1]

a. a1 belongs to class 1, a2 belongs to class 2
b. a2 belongs to class 1, a1 belongs to class 2
c. a1 belongs to class 2, a2 belongs to class 2
d. a1 belongs to class 1, a2 belongs to class 1

Answer: a. a1 belongs to class 1, a2 belongs to class 2
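To verify, evaluate g(x) = 4x₁ + 6x₂ − 11 at each point and read the class off the sign:

```python
import numpy as np

w, b = np.array([4.0, 6.0]), -11.0   # boundary: 4*x1 + 6*x2 - 11 = 0

def classify(x):
    """Positive g(x) -> class 1, negative -> class 2."""
    g = w @ np.asarray(x, dtype=float) + b
    return 1 if g > 0 else 2

print(classify([1, 2]))  # g = 4 + 12 - 11 =  5 > 0 -> class 1
print(classify([1, 1]))  # g = 4 +  6 - 11 = -1 < 0 -> class 2
```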






More weeks of Deep Learning: Click Here

More Nptel Courses: https://progiez.com/nptel

