Deep Learning | Week 4

Course Name: Deep Learning

Course Link: Click Here

These are NPTEL Deep Learning Week 4 Assignment 4 Answers


Q1. A given cost function is of the form J(θ) = 2θ² − 4θ + 2. What is the weight update rule for gradient descent optimization at step t+1? Consider α = 0.01 to be the learning rate.
a. θₜ₊₁ = θₜ − 0.01(2θₜ)
b. θₜ₊₁ = θₜ + 0.01(2θₜ)
c. θₜ₊₁ = θₜ(2θₜ)
d. θₜ₊₁ = θₜ − 0.04(θₜ − 1)

Answer: d. θₜ₊₁ = θₜ − 0.04(θₜ − 1)
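
For the reconstructed cost J(θ) = 2θ² − 4θ + 2 we have dJ/dθ = 4θ − 4, so the update is θₜ₊₁ = θₜ − 0.01(4θₜ − 4) = θₜ − 0.04(θₜ − 1). A minimal Python sketch of this update (the starting value below is an arbitrary assumption):

```python
# Gradient descent on J(theta) = 2*theta**2 - 4*theta + 2 with alpha = 0.01.
def grad(theta):
    return 4 * theta - 4              # dJ/dtheta

alpha = 0.01
theta = 5.0                           # arbitrary starting value (assumption)
for _ in range(1000):
    theta -= alpha * grad(theta)      # same as theta - 0.04 * (theta - 1)

print(round(theta, 4))                # approaches the minimizer theta = 1
```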


Q2. Which of the following activation functions leads to sparse activation maps?
a. Sigmoid
b. Tanh
c. Linear
d. ReLU

Answer: d. ReLU
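
ReLU outputs exactly zero for every negative pre-activation, which is what makes its activation maps sparse; a quick NumPy sketch with made-up values illustrates this (sigmoid, by contrast, never produces exact zeros):

```python
import numpy as np

x = np.array([-2.1, -0.3, 0.0, 0.7, -1.5, 2.4])   # hypothetical pre-activations

relu = np.maximum(0, x)            # ReLU: negatives become exactly 0 -> sparse
sigmoid = 1 / (1 + np.exp(-x))     # sigmoid: always strictly positive

print(np.count_nonzero(relu))      # only 2 of the 6 units remain active
print(np.count_nonzero(sigmoid))   # all 6 units are non-zero
```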


Q3. If yᵢ is the ground-truth label for the i-th training sample and pᵢ is the predicted label, which of the following is a feasible loss function for training a neural net on a binary classification task? C is the number of training samples, and all sums below run over i = 1 to C.
a. −Σᵢ yᵢ log pᵢ − Σᵢ (1 − yᵢ) log(1 − pᵢ)
b. −Σᵢ yᵢ log pᵢ − Σᵢ yᵢ log(1 − pᵢ)
c. Σᵢ yᵢ log(1 − pᵢ) − Σᵢ (1 − yᵢ) log(pᵢ)
d. Σᵢ (1 − yᵢ) log pᵢ − Σᵢ yᵢ log(1 − pᵢ)

Answer: a. −Σᵢ yᵢ log pᵢ − Σᵢ (1 − yᵢ) log(1 − pᵢ)
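
Option (a) is the standard binary cross-entropy. A small NumPy sketch (the labels and probabilities below are made up for illustration) shows how it is computed:

```python
import numpy as np

y = np.array([1, 0, 1, 1])            # hypothetical ground-truth labels y_i
p = np.array([0.9, 0.2, 0.7, 0.6])    # hypothetical predicted probabilities p_i

# Option (a): -sum(y*log p) - sum((1-y)*log(1-p))
bce = -np.sum(y * np.log(p)) - np.sum((1 - y) * np.log(1 - p))
print(bce)   # small when predictions agree with the labels
```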


Q4. Which logic function cannot be performed using a single-layered Neural Network?
a. AND
b. OR
c. XOR
d. All

Answer: c. XOR
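
A single-layer network computes a linear decision boundary, so it can realize AND and OR but not XOR. The brute-force sketch below (an illustration, not part of the assignment) searches a coarse grid of weights for one threshold unit and finds none that reproduces XOR:

```python
import numpy as np
from itertools import product

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = {
    "AND": np.array([0, 0, 0, 1]),
    "OR":  np.array([0, 1, 1, 1]),
    "XOR": np.array([0, 1, 1, 0]),
}

def realizable(t):
    # Brute-force search over a coarse weight/bias grid for a single
    # threshold unit y = step(w1*x1 + w2*x2 + b).
    grid = np.linspace(-2, 2, 17)
    for w1, w2, b in product(grid, grid, grid):
        y = (X @ np.array([w1, w2]) + b > 0).astype(int)
        if np.array_equal(y, t):
            return True
    return False

for name, t in targets.items():
    print(name, realizable(t))   # AND True, OR True, XOR False
```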


Q5. Which of the following options most closely relates to the following graph? Green crosses are samples of Class-A, mustard rings are samples of Class-B, and the red line is the separating boundary between the two classes.

[Figure: scatter plot of Class-A (green crosses) and Class-B (mustard rings) with the red separating line]

a. High Bias
b. Zero Bias
c. Zero Bias and High Variance
d. Zero Bias and Zero Variance

Answer: a. High Bias


Q6. Which of the following statements is true?
a. L2 regularization leads to sparse activation maps
b. L1 regularization leads to sparse activation maps
c. Some of the weights are squashed to zero in L2 regularization
d. L2 regularization is also known as Lasso

Answer: b. L1 regularization leads to sparse activation maps
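
One way to see why L1 (and not L2) produces sparsity (a sketch, not from the course): the soft-thresholding step associated with an L1 penalty sets small weights exactly to zero, while the corresponding L2 step only shrinks them:

```python
import numpy as np

w = np.array([0.03, -0.8, 0.002, 1.5, -0.04])   # hypothetical weights
lam = 0.05                                       # regularization strength

l1_step = np.sign(w) * np.maximum(np.abs(w) - lam, 0)   # soft-thresholding (L1)
l2_step = w / (1 + lam)                                  # shrinkage (L2)

print(l1_step)   # small weights become exactly 0 -> sparsity
print(l2_step)   # all weights shrink but stay non-zero
```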


Q7. Which among the following options gives the range of the tanh function?
a. -1 to 1
b. -1 to 0
c. 0 to 1
d. 0 to infinity

Answer: a. -1 to 1
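
A quick numerical check of the tanh range (NumPy sketch):

```python
import numpy as np

x = np.linspace(-10, 10, 1001)
y = np.tanh(x)
print(y.min(), y.max())   # close to -1 and 1; tanh stays within (-1, 1)
```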


Q8. Consider the following neural network shown in the figure with inputs x₁, x₂ and output Y. The inputs take the values x₁, x₂ ∈ {0, 1}. The logical operation performed by the network is

[Figure: two-input neural network with inputs x₁, x₂ and output Y (weights not reproduced)]

a. AND
b. OR
c. XOR
d. NOR

Answer: d. NOR
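
The exact weights in the figure are not reproduced above, but a NOR gate can be realized by a single threshold unit; the weights and bias below are hypothetical values chosen for illustration and may differ from the figure:

```python
import numpy as np

# Hypothetical weights/bias implementing NOR with one threshold unit.
w = np.array([-1.0, -1.0])
b = 0.5

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y = int(w @ np.array([x1, x2]) + b > 0)
    print((x1, x2), "->", y)   # output is 1 only when both inputs are 0 (NOR)
```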


Q9. When is the gradient descent algorithm certain to find a global minimum?
a. For a convex cost plot
b. For a concave cost plot
c. For a union of 2 convex cost plots
d. For a union of 2 concave cost plots

Answer: a. For a convex cost plot
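
A small sketch (the example functions here are chosen for illustration, not from the assignment): on a convex cost such as f(x) = x², gradient descent reaches the global minimum from any starting point, whereas on a non-convex cost it may stop at a local minimum depending on initialization:

```python
def descend(grad, x0, lr=0.1, steps=2000):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Convex: f(x) = x^2, global minimum at x = 0, found from any start.
print(descend(lambda x: 2 * x, x0=7.0))      # ~0.0
print(descend(lambda x: 2 * x, x0=-3.0))     # ~0.0

# Non-convex: f(x) = x^4 - 3x^2 + x has two basins; the result depends on x0.
g = lambda x: 4 * x**3 - 6 * x + 1
print(descend(g, x0=2.0, lr=0.01))           # settles in a local minimum (~1.13)
print(descend(g, x0=-2.0, lr=0.01))          # reaches the global minimum (~-1.30)
```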


Q10. Let X = [-1, 0, 3, 5] be the input to the i-th layer of a neural network, on which we want to apply the softmax function. What should be its output?
a. [0.368, 1, 20.09, 148.41]
b. [0.002, 0.006, 0.118, 0.874]
c. [0.3, 0.05, 0.6, 0.05]
d. [0.04, 0, 0.06, 0.9]

Answer: b. [0.002, 0.006, 0.118, 0.874]
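
The answer can be checked with a short NumPy softmax computation:

```python
import numpy as np

x = np.array([-1.0, 0.0, 3.0, 5.0])
softmax = np.exp(x) / np.sum(np.exp(x))
print(np.round(softmax, 3))   # [0.002 0.006 0.118 0.874] -> option (b)
```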



These are NPTEL Deep Learning Week 4 Assignment 4 Answers

More weeks of Deep Learning: Click Here

More Nptel Courses: https://progiez.com/nptel


The content uploaded on this website is for reference purposes only. Please do it yourself first.