Natural Language Processing Nptel Week 7 Quiz Answers

Are you looking for Natural Language Processing Nptel Week 7 Quiz Answers? You’ve come to the right place! Access the most accurate answers at Progiez.


Natural Language Processing Nptel Week 7 Quiz Answers (Jan-Apr 2025)

Course Link: Click Here


Que. 1
Suppose you have a raw text corpus and you compute a word co-occurrence matrix from it. Which of the following algorithm(s) can you use to learn word representations? (Choose all that apply)

a) CBOW
b) SVD
c) PCA
d) GloVe

View Answer
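
To see how an option like SVD works in practice, here is a minimal Python/NumPy sketch of learning dense word vectors from a co-occurrence matrix via truncated SVD. The counts in the matrix are invented purely for illustration:

import numpy as np

# Toy co-occurrence matrix (made-up counts):
# rows = target words, columns = context words.
X = np.array([
    [10.0, 0.0, 3.0],
    [ 8.0, 1.0, 2.0],
    [ 0.0, 7.0, 9.0],
])

# Truncated SVD: keep only the top-k singular dimensions.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
word_vectors = U[:, :k] * S[:k]  # one k-dimensional vector per word
print(word_vectors)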


Que. 2
Using word vectors, what is the method for solving word analogy questions of the form: given A, B, and D, find C such that A:B::C:D?

a) V_c = V_a + (V_b − V_a), then use cosine similarity to find the word closest to V_c.
b) V_c = V_a + (V_d − V_b), then do a dictionary lookup for V_c.
c) V_c = V_a + (V_a − V_p), then use cosine similarity to find the word closest to V_c.
d) V_c = V_a + (V_a − V_b), then do a dictionary lookup for V_c.
e) None of the above

View Answer
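
A minimal sketch of the vector-offset method these options describe, assuming a hypothetical embeddings dictionary that maps each word to a NumPy array:

import numpy as np

def cosine(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def solve_analogy(a, b, d, embeddings):
    # Given A:B::C:D with A, B, and D known, estimate
    # V_c = V_a + (V_d - V_b), then return the vocabulary word whose
    # vector is closest to V_c by cosine similarity.
    # `embeddings` is a hypothetical word -> vector dictionary.
    v_c = embeddings[a] + (embeddings[d] - embeddings[b])
    candidates = (w for w in embeddings if w not in {a, b, d})
    return max(candidates, key=lambda w: cosine(v_c, embeddings[w]))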


Que. 3
What is the value of PMI(w_1, w_2) for C(w_1) = 100, C(w_2) = 2500, C(w_1, w_2) = 320, and N = 50000?

N: total number of documents.
C(w_1): number of documents in which w_1 appears.
C(w_1, w_2): number of documents in which both words appear.

Note: Use base 2 in the logarithm.

a) 4
b) 5
c) 6
d) 5.64

View Answer
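
The value can be verified in a few lines of Python by plugging the given counts into the document-level PMI definition, PMI(w_1, w_2) = log2( P(w_1, w_2) / (P(w_1) * P(w_2)) ):

import math

N = 50000       # total number of documents
c_w1 = 100      # documents containing w1
c_w2 = 2500     # documents containing w2
c_both = 320    # documents containing both words

pmi = math.log2((c_both / N) / ((c_w1 / N) * (c_w2 / N)))
print(pmi)  # ≈ 6.0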


Que. 4
Given two binary word vectors w_1 and w_2 as follows:

w_1 = [1010011010]
w_2 = [0011111100]

Compute the Dice and Jaccard similarity between them.

a) 6/11, 3/8
b) 10/11, 5/6
c) 4/9, 2/7
d) 5/9, 5/8

View Answer
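
A quick Python check, using Dice = 2|X ∩ Y| / (|X| + |Y|) and Jaccard = |X ∩ Y| / |X ∪ Y| over the positions set to 1 in each vector:

import numpy as np
from fractions import Fraction

w1 = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
w2 = np.array([0, 0, 1, 1, 1, 1, 1, 1, 0, 0])

both = int((w1 & w2).sum())            # 1s shared by both vectors: 3
n1, n2 = int(w1.sum()), int(w2.sum())  # 1s in each vector: 5 and 6

dice = Fraction(2 * both, n1 + n2)
jaccard = Fraction(both, n1 + n2 - both)
print(dice, jaccard)  # 6/11 3/8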


Que. 5
Consider the probability distributions for two words, p and q, given below. Compute their similarity scores using KL-divergence.

p = [0.20, 0.75, 0.50]
q = [0.90, 0.10, 0.25]

Note: Use base 2 in the logarithm.

a) 4.704, 1.720
b) 1.692, 0.553
c) 2.246, 1.412
d) 3.213, 2.426

View Answer
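
KL-divergence is asymmetric, so both directions can be computed: D(p || q) and D(q || p). A quick Python check with base-2 logarithms:

import math

p = [0.20, 0.75, 0.50]
q = [0.90, 0.10, 0.25]

def kl(p, q):
    # D(p || q) = sum over i of p_i * log2(p_i / q_i)
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

print(round(kl(p, q), 3), round(kl(q, p), 3))  # ≈ 2.246 and ≈ 1.412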


Que. 6
Consider the word co-occurrence matrix given below. Compute the cosine similarity between:

(i) w_1 and w_2, and
(ii) w_1 and w_3.

      w_4   w_5   w_6
w_1    2     8     5
w_2    4     9     7
w_3    1     2     3

a) 0.773, 0.412
b) 0.881, 0.764
c) 0.987, 0.914
d) 0.897, 0.315

View Answer
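
Treating each row of the matrix as that word's vector, the two similarities can be checked in Python:

import numpy as np

w1 = np.array([2, 8, 5])
w2 = np.array([4, 9, 7])
w3 = np.array([1, 2, 3])

def cosine(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(w1, w2))  # ≈ 0.987
print(cosine(w1, w3))  # ≈ 0.9146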


Que. 7
Which of the following types of relations can be captured by word2vec (CBOW or Skip-gram)?

a) Analogy (A:B::C:?)
b) Antonymy
c) Polysemy
d) All of the above

View Answer



For answers to other Nptel courses, please refer to this link: NPTEL Assignment