Natural Language Processing (NLP) MCQ's




Question 751 :
Cohesion binds text together. Consider the following piece of text: "Yesterday, my friend invited me to her house. When I reached, my friend was preparing coffee. Her father was cleaning dishes. Her mother was busy writing a book." Each occurrence of "her" in the above text refers to which noun phrase?


  1. Me
  2. Friend's father
  3. Friend's mother
  4. My friend's
  

Question 752 :
NLP is concerned with the interactions between


  1. Computers and human (natural) languages
  2. Machine and machine
  3. Human and machine
  4. Both (a) and (b)
  

Question 753 :
Rule-based POS taggers do not possess which of the following properties?


  1. The rules in rule-based POS tagging are built automatically
  2. These taggers are knowledge-driven taggers
  3. These taggers consist of many hand-written rules
  4. The information is coded in the form of rules
  

Question 754 :
The words that pronouns refer back to are called __________.


  1. Antecedent
  2. Context
  3. Reference
  4. Speech act
  

Question 755 :
The reference to an entity that has been previously introduced into the sentence is called __________.


  1. discourse
  2. anaphora
  3. co refer
  4. referent
  

Question 756 :
In the sentence, 'He ate the pizza', the BOLD part is an example of _____.


  1. Noun phrase
  2. Verb phrase
  3. Prepositional phrase
  4. Adverbial phrase
  

Question 757 :
Which one of the following is not a tool/technique that can be used for sentiment analysis?


  1. SentiWordNet
  2. Latent semantic analysis
  3. Latent semantic analysis
  4. Abstractive analysis
  

Question 758 :
Which class of words is limited in number?


  1. Open class
  2. Closed class
  3. Tree bank
  4. Dictionary
  

Question 759 :
What is the main reason for tokenization?


  1. It is the simplest process
  2. Processing on words can be easily performed
  3. Almost all tokenization algorithms execute in polynomial time
  4. Ready-made programs are available in various programming languages
  

Question 760 :
Which type of ambiguity is present in the sentence "Old men and women were taken to safe locations"?


  1. Attachment ambiguity
  2. Scope Ambiguity
  3. Discourse ambiguity
  4. Semantics Ambiguity
  

Question 761 :
Suppose we want to calculate the probability of the observation sequence {'Dry', 'Rain'}. If the following are the possible hidden state sequences, then P('Dry', 'Rain') = ______.
Transition probabilities: P('Low'|'Low') = 0.3, P('High'|'Low') = 0.7, P('Low'|'High') = 0.2, P('High'|'High') = 0.8
Observation probabilities: P('Rain'|'Low') = 0.6, P('Dry'|'Low') = 0.4, P('Rain'|'High') = 0.4, P('Dry'|'High') = 0.3
Initial probabilities: P('Low') = 0.4, P('High') = 0.6


  1. 0.1748
  2. 0.2004
  3. 0.1208
  4. 0.2438
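
The quantity asked for above can be computed with the forward algorithm, which sums over all hidden state sequences. Below is a minimal Python sketch using the probabilities exactly as printed (note that the emission probabilities for 'High' sum to 0.7, so the question's numbers may contain a typo):

```python
# Forward algorithm for P('Dry', 'Rain') over a two-state HMM.
# All probabilities are taken verbatim from the question.
init = {"Low": 0.4, "High": 0.6}
trans = {"Low": {"Low": 0.3, "High": 0.7},
         "High": {"Low": 0.2, "High": 0.8}}
emit = {"Low": {"Rain": 0.6, "Dry": 0.4},
        "High": {"Rain": 0.4, "Dry": 0.3}}

obs = ["Dry", "Rain"]

# alpha[s] = probability of the observations so far, ending in state s
alpha = {s: init[s] * emit[s][obs[0]] for s in init}
for o in obs[1:]:
    alpha = {s: sum(alpha[p] * trans[p][s] for p in alpha) * emit[s][o]
             for s in init}

total = sum(alpha.values())
print(round(total, 4))  # 0.1528 with the numbers as given
```

Restricting the sum to only the hidden state sequences listed in the original exam (elided here) would give a different value.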
  

Question 762 :
"Beverage = coffee, tea, shake" is an example of ______.


  1. Meronymy
  2. Hyponymy
  3. Polysemy
  4. Clines
  

Question 763 :
Given a sentence S = w1 w2 w3 ... wn, how would you compute the likelihood of S using a bigram model?


  1. Calculate the conditional probability of each word in the sentence given the preceding word and add the resulting numbers
  2. Calculate the conditional probability of each word in the sentence given the preceding word and multiply the resulting numbers
  3. Calculate the conditional probability of each word given all preceding words in a sentence and add the resulting numbers
  4. Calculate the conditional probability of each word given all preceding words in a sentence and multiply the resulting numbers
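
The bigram computation can be sketched in a few lines of Python; the toy corpus below is purely illustrative:

```python
from collections import Counter

# Toy corpus (an illustrative assumption, not from the question).
corpus = "the cat sat on the mat the cat ate".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_likelihood(sentence):
    """Multiply P(w_i | w_{i-1}) over adjacent pairs, using MLE counts."""
    words = sentence.split()
    p = 1.0
    for prev, cur in zip(words, words[1:]):
        p *= bigrams[(prev, cur)] / unigrams[prev]
    return p

print(bigram_likelihood("the cat sat"))  # P(cat|the) * P(sat|cat) = 2/3 * 1/2
```

For brevity this sketch omits the first word's own probability; full models usually condition it on a start-of-sentence symbol.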
  

Question 764 :
A CFG consists of


  1. Set of rules
  2. Set of productions
  3. Order of elements
  4. Rules, productions, and order of elements
  

Question 765 :
What is the right order of components for a text classification model? 1. Text cleaning 2. Text annotation 3. Gradient descent 4. Model tuning 5. Text to predictors


  1. 12345
  2. 13425
  3. 12534
  4. 13452
  

Question 766 :
Which of the following techniques is most appropriate for getting the root of a word without considering word syntax?


  1. Stemming
  2. Lemmatization
  3. Stop word removal
  4. Rooting
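
Stemming chops suffixes without regard to a word's syntactic category, whereas lemmatization consults vocabulary and morphology. A crude suffix-stripping sketch (real systems use the Porter or Snowball algorithms):

```python
# Crude suffix stripping: try longer suffixes first, keep a minimum stem length.
def crude_stem(word):
    for suffix in ("ing", "ies", "es", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

print(crude_stem("running"))  # runn -- a stem need not be a valid word
print(crude_stem("cats"))     # cat
```

The output "runn" shows why stemming is the answer here: it applies mechanical rules with no syntactic or dictionary knowledge.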
  

Question 767 :
How many lexemes are there in the following list: man, men, girls, girl, mouse?


  1. 4
  2. 5
  3. 3
  4. 2
  

Question 768 :
Which is the most suitable technique for finding trending topics on Twitter?


  1. Term frequency
  2. NER
  3. Tokenization
  4. Segmentation
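
Term frequency simply counts how often each token appears across the stream; the most frequent tokens become the trending candidates. A minimal sketch with made-up tweets:

```python
from collections import Counter

# Hypothetical tweet sample; the hashtags and text are invented for illustration.
tweets = [
    "#worldcup final tonight",
    "watching the #worldcup with friends",
    "#elections results are out",
    "#worldcup fever everywhere",
]

# Count every token across the stream and rank by frequency.
counts = Counter(tok for t in tweets for tok in t.lower().split())
print(counts.most_common(1))  # [('#worldcup', 3)]
```

Production systems additionally normalize by a historical baseline so that perennially common words do not dominate the ranking.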
  

Question 769 :
Which is not a type of antonym?


  1. Polar antonyms
  2. Equipollent antonyms
  3. Overlapping antonyms
  4. Unipolar antonyms
  

Question 770 :
Classifying email as spam, labelling web pages based on their content, and voice recognition are examples of _____.


  1. Supervised learning
  2. Unsupervised learning
  3. Machine learning
  4. Deep learning
  