Introduction To Machine Learning IIT-KGP Nptel Week 6 Assignment Answers

Are you looking for Nptel Introduction To Machine Learning IIT-KGP Week 6 Answers 2024? This guide offers comprehensive assignment solutions tailored to help you master key neural network concepts such as perceptrons, backpropagation, and convolutional neural networks.

Course Link: Click Here


Introduction To Machine Learning IIT-KGP Week 6 Answers (July-Dec 2024)


Q1. The neural network given below takes two binary-valued inputs x₁, x₂ ∈ {0, 1}, and the activation function is the binary threshold function (h(x) = 1 if x > 0; 0 otherwise). Which of the following logical functions does it compute?

A) AND
B) OR
C) NAND
D) None of the above

Answer: A) AND
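The network diagram from the question is not reproduced above, so the weights below are illustrative assumptions (w1 = w2 = 1, bias = -1.5); with a binary threshold activation they make the unit fire only when both inputs are 1, i.e. it computes AND:

```python
# Minimal sketch of a binary threshold unit computing AND.
# The weights and bias are assumed values (the question's figure is not shown here).
def threshold_unit(x1, x2, w1=1.0, w2=1.0, b=-1.5):
    s = w1 * x1 + w2 * x2 + b
    return 1 if s > 0 else 0          # h(x) = 1 if x > 0, else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", threshold_unit(x1, x2))   # prints 1 only for (1, 1)
```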


Q2. What is the sequence of the following tasks in a perceptron?
I) Initialize the weights of the perceptron randomly.
II) Go to the next batch of data set.
III) If the prediction does not match the output, change the weights.
IV) For a sample input, compute an output.
A) I,II,III,IV
B) IV,III,II,I
C) III,I,II,IV
D) I,IV,III,II

Answer: D) I,IV,III,II
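The sketch below (a hedged example with assumed data, learning rate, and epoch count) walks through the answer's order: initialize the weights randomly (I), compute an output for a sample (IV), change the weights on a mismatch (III), then move on to the next sample/batch (II):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])                  # illustrative labels (AND)

w = rng.normal(size=2)                      # (I)  initialize the weights randomly
b = 0.0
lr = 0.1                                    # assumed learning rate

for epoch in range(20):
    for xi, target in zip(X, y):            # (II) go to the next sample/batch
        pred = 1 if xi @ w + b > 0 else 0   # (IV) compute an output for a sample
        if pred != target:                  # (III) change the weights on a mismatch
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
```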


For answers or latest updates join our telegram channel: Click here to join

These are Introduction To Machine Learning IIT-KGP Week 6 Answers


Q3. Suppose you have inputs x, y, and z with values -2, 5, and -4 respectively. You have a neuron 'q' and a neuron 'f' whose functions are given in the question figure.

What is the gradient of f with respect to x, y, and z?
A) (-3,4,4)
B) (4,4,3)
C) (-4,-4,3)
D)(3,-4,-4)

Answer: C) (-4,-4,3)
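The functions for q and f are not reproduced above; the given answer is consistent with the common version of this exercise where q = x + y and f = q · z, which is the assumption behind this worked sketch:

```python
# Hedged worked example, assuming q = x + y and f = q * z (not stated above).
x, y, z = -2.0, 5.0, -4.0

q = x + y                 # forward pass: q = 3
f = q * z                 # forward pass: f = -12

df_dq = z                 # local gradient of f = q * z w.r.t. q
df_dz = q                 # = 3
df_dx = df_dq * 1.0       # chain rule through q = x + y  -> -4
df_dy = df_dq * 1.0       # -> -4

print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0, matching option C
```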


Q4. For a fully-connected neural network with one hidden layer, what effect should increasing the number of hidden units have on bias and variance?

A. Decrease bias, increase variance

B. Increase bias, increase variance

C. Increase bias, decrease variance

D. No effect

Answer: A. Decrease bias, increase variance




Q5. Which of the following is true about model capacity (where model capacity means the ability of a neural network to approximate complex functions)?

A) As the number of hidden layers increases, model capacity increases

B) As dropout ratio increases, model capacity increases

C) As learning rate increases, model capacity increases

D) None of these.

Answer: A) As the number of hidden layers increases, model capacity increases


Q6. The back-propagation learning algorithm applied to a two-layer neural network
A) always finds the globally optimal solution.
B) finds a locally optimal solution which may be globally optimal.
C) never finds the globally optimal solution.
D) finds a locally optimal solution which is never globally optimal

Answer: B) finds a locally optimal solution which may be globally optimal.




Q7. Which of the following gives non-linearity to a neural network?
A) Gradient descent
B) Bias
C) Sigmoid Activation Function
D) None

Answer: C) Sigmoid Activation Function
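A quick way to see why the activation function (and not the bias or the optimizer) supplies the non-linearity: stacking purely linear layers collapses to a single linear map, whereas inserting a sigmoid between them does not. The weights below are illustrative assumptions:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

W1 = np.array([[2.0]])        # assumed 1x1 weight matrices
W2 = np.array([[0.5]])
x = np.array([1.5])

linear_stack = W2 @ (W1 @ x)          # same as applying the single matrix W2 @ W1
nonlinear = W2 @ sigmoid(W1 @ x)      # the sigmoid breaks the linearity
print(linear_stack, nonlinear)
```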


Q8. The network that involves backward links from outputs to the inputs and hidden layers is called

A) Self-organizing Maps

B) Perceptron

C) Recurrent Neural Networks

D) Multi-Layered Perceptron

Answer: C) Recurrent Neural Networks
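As a rough illustration of the "backward links" the question refers to, the sketch below (with assumed sizes and random weights) feeds the hidden state of one time step back into the next, which is exactly the recurrence that distinguishes an RNN from a feed-forward perceptron:

```python
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 2))    # input -> hidden weights (assumed sizes)
W_hh = rng.normal(size=(3, 3))    # hidden -> hidden weights: the recurrent link
h = np.zeros(3)                   # initial hidden state

for x_t in rng.normal(size=(5, 2)):        # a sequence of 5 inputs
    h = np.tanh(W_xh @ x_t + W_hh @ h)     # h_t depends on h_{t-1}
print(h)
```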




Q9. A Convolutional Neural Network (CNN) is a Deep Neural Network which can extract various abstract features from an input required for a given task. Given below are the operations performed by a CNN on an input:
1) Max Pooling
2) Convolution Operation
3) Flatten
4) Forward propagation by Fully Connected Network
Identify the correct sequence of operations performed from the options below:
A) 4,3,2,1
B) 2,1,3,4
C) 3,1,2,4
D) 4,2,1,3

Answer: B) 2,1,3,4
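A minimal sketch of that order using Keras (assuming TensorFlow is installed; the layer sizes and input shape are illustrative, not from the question): convolution, then max pooling, then flatten, then the fully connected forward pass:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(8, (3, 3), activation="relu", input_shape=(28, 28, 1)),  # 2) convolution
    layers.MaxPooling2D((2, 2)),                                           # 1) max pooling
    layers.Flatten(),                                                      # 3) flatten
    layers.Dense(10, activation="softmax"),                                # 4) fully connected layer
])
model.summary()
```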




Q10. In training a neural network, we notice that the loss does not decrease in the first few epochs. What could be the reason for this?

A) The learning Rate is low.

B) The Regularization Parameter is High.

C) Stuck at the Local Minima.

D) All of the above could be the reason.

Answer: D) All of the above could be the reason.




All weeks of Introduction to Machine Learning: Click Here

More Nptel Courses: https://progiez.com/nptel-assignment-answers