NLP (Natural Language Processing) MCQs




Question 271 :
Given a set of unigram and bigram probabilities, what is the probability of the following sequence ‘do Sam I like’ according to the bigram language model? P(do|<s>) = 2/11, P(do|Sam) = 1/11, P(Sam|<s>) = 4/11, P(Sam|do) = 1/8, P(I|Sam) = 4/11, P(Sam|I) = 2/9, P(I|do) = 2/8, P(I|like) = 2/7, P(like|I) = 3/11, P(do) = 3/8, P(Sam) = 2/11, P(I) = 4/11, P(like) = 5/11


  1. 3/11 * 2/11 * 4/11 * 5/11
  2. 2/11 * 1/8 * 4/11 * 3/11
  3. 2/11 * 1/11 * 2/9 * 2/7
  4. 2/11 + 1/11 + 2/9 + 2/7
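
For reference, a minimal sketch of how the bigram chain for Question 271 multiplies out, using Python's `fractions` module; the start-of-sentence token is written here as `<s>`, and only the four bigrams actually needed are included:

```python
from fractions import Fraction

# Bigram probabilities taken from Question 271 (start token written as <s>).
bigram = {
    ("<s>", "do"): Fraction(2, 11),
    ("do", "Sam"): Fraction(1, 8),
    ("Sam", "I"): Fraction(4, 11),
    ("I", "like"): Fraction(3, 11),
}

def bigram_sentence_probability(words, bigram_probs, start="<s>"):
    """Multiply P(w1|<s>) * P(w2|w1) * ... under a bigram language model."""
    prob = Fraction(1)
    prev = start
    for w in words:
        prob *= bigram_probs[(prev, w)]
        prev = w
    return prob

print(bigram_sentence_probability(["do", "Sam", "I", "like"], bigram))
# 2/11 * 1/8 * 4/11 * 3/11 = 3/1331, i.e. the product in option 2
```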
  

Question 272 :
Discourse analysis is a part of ____________.


  1. Semantic Analysis
  2. Syntax Analysis
  3. Pragmatics
  4. Morphology
  

Question 273 :
The English words ‘through’ and ‘threw’ are examples of ____________


  1. Antonymy
  2. Polysemy
  3. Synonymy
  4. Homophony
  

Question 274 :
Video summarization extracts the most important frames from the _____ content


  1. Video
  2. Image
  3. Sound
  4. Document
  

Question 275 :
Classifying email as spam, labelling web pages based on their content, and voice recognition are examples of _____.


  1. Supervised learning
  2. Unsupervised learning
  3. Machine learning
  4. Deep learning
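
A minimal supervised-learning sketch of the spam-classification example in Question 275, assuming scikit-learn is available; the tiny training set and its labels below are made up purely for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labelled data (hypothetical): supervised learning needs labelled examples.
emails = ["win a free prize now", "meeting agenda for monday",
          "cheap loans click here", "project status report attached"]
labels = ["spam", "ham", "spam", "ham"]

# Bag-of-words features + Naive Bayes classifier trained on the labels.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["free prize click now"]))   # likely ['spam']
```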
  

Question 276 :
What is the major difference between CRF (Conditional Random Field) and HMM (Hidden Markov Model)?


  1. CRF is a generative model whereas HMM is a discriminative model
  2. CRF is a discriminative model whereas HMM is a generative model
  3. Both CRF and HMM are generative models
  4. Both CRF and HMM are discriminative models
  

Question 277 :
How can WordNet be used to measure semantic relatedness between words?


  1. Measure the shortest path between two words on WordNet
  2. Count the number of shared parent nodes
  3. Measure the difference between their depths in WordNet
  4. Measure the difference between the number of child nodes they have.
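
A small illustration of the shortest-path idea from option 1 of Question 277, using NLTK's WordNet interface (assumes NLTK is installed and the WordNet corpus can be downloaded):

```python
import nltk
nltk.download("wordnet", quiet=True)  # one-time download of the WordNet corpus
from nltk.corpus import wordnet as wn

dog = wn.synset("dog.n.01")
cat = wn.synset("cat.n.01")

# path_similarity is based on the shortest path between the two synsets
# in the WordNet hypernym hierarchy (1.0 = identical, lower = less related).
print(dog.path_similarity(cat))
```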
  

Question 278 :
In text mining, how is the word ‘lovely’ converted to ‘love’?


  1. By stemming
  2. By tokenization
  3. By lemmatization
  4. By rooting
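
A quick way to see the conversion asked about in Question 278, assuming NLTK is installed: the Porter stemmer strips the suffix of ‘lovely’, which a WordNet lemmatizer would not do for this word:

```python
import nltk
nltk.download("wordnet", quiet=True)  # needed by the lemmatizer
from nltk.stem import PorterStemmer, WordNetLemmatizer

print(PorterStemmer().stem("lovely"))           # expected 'love'  (suffix stripped)
print(WordNetLemmatizer().lemmatize("lovely"))  # 'lovely' (lemma left unchanged)
```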
  

Question 279 :
How many states are required for a finite automaton to accept strings ending with ‘10’?


  1. 3
  2. 2
  3. 1
  4. can’t be represented.
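
A three-state DFA sketch for Question 279 that accepts binary strings ending in ‘10’ (the state names q0/q1/q2 are arbitrary labels chosen for this illustration):

```python
# Transition table of a 3-state DFA over {0, 1} accepting strings that end in "10".
# q0 = no useful suffix seen, q1 = last symbol was '1', q2 = last two symbols were "10".
TRANSITIONS = {
    ("q0", "0"): "q0", ("q0", "1"): "q1",
    ("q1", "0"): "q2", ("q1", "1"): "q1",
    ("q2", "0"): "q0", ("q2", "1"): "q1",
}
ACCEPTING = {"q2"}

def accepts(string):
    state = "q0"
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

print(accepts("0110"))  # True  (ends in "10")
print(accepts("101"))   # False
```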
  

Question 280 :
NLP stands for ________.


  1. Natural Language Protocol
  2. Natural Lingual Protocol
  3. Natural Lingual Processing
  4. Natural Language Processing
  

Question 281 :
Deciding the insurance premium of a car based on online customer reviews is an application of ______________________.


  1. Information Retrieval
  2. Information Extraction
  3. Sentiment Analysis
  4. Text Summarization
  

Question 282 :
Which of the text parsing techniques can be used for noun phrase detection, verb phrase detection, subject detection, and object detection in NLP?


  1. Part-of-speech tagging
  2. Skip-gram and n-gram extraction
  3. Continuous bag of words
  4. Dependency parsing and constituency parsing
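
A small sketch of option 4 from Question 282 using spaCy's dependency parse; it assumes spaCy and its small English model `en_core_web_sm` are installed:

```python
import spacy

nlp = spacy.load("en_core_web_sm")      # small English pipeline (assumed installed)
doc = nlp("The cat chased the mouse.")

# Noun phrases come from the parse; subjects/objects come from dependency labels.
print([chunk.text for chunk in doc.noun_chunks])          # ['The cat', 'the mouse']
print([(t.text, t.dep_) for t in doc
       if t.dep_ in ("nsubj", "dobj")])                   # subject and object tokens
```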
  

Question 283 :
In semantic analysis, word embedding is used to _______


  1. Classify ambiguity in a sentence
  2. Convert text data to a numeric vector
  3. Feature Selection
  4. Feature Reduction
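
A minimal word-embedding sketch for Question 283 using gensim 4.x Word2Vec on a toy corpus (real embeddings need far more text); the point is only that each word ends up as a numeric vector:

```python
from gensim.models import Word2Vec

# Tiny toy corpus (hypothetical); each sentence is a list of tokens.
sentences = [["i", "like", "nlp"],
             ["i", "like", "deep", "learning"],
             ["nlp", "is", "fun"]]

model = Word2Vec(sentences, vector_size=8, window=2, min_count=1, seed=1)
print(model.wv["nlp"])   # the word 'nlp' represented as an 8-dimensional numeric vector
```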
  

Question 284 :
The Maximum Entropy Markov Model (MEMM) is used to handle ______


  1. Unknown words
  2. Known words
  3. Multiple tags
  4. Single-tag words
  

Question 285 :
Which MT systems involve low computational costs and can be extended easily?


  1. Retrieval-based MT
  2. Example-based MT
  3. Speech-based MT
  4. Interlingua-based MT
  

Question 286 :
The representation in GB (Government and Binding) theory does not include ________


  1. s-structure
  2. d-structure
  3. phonetic form.
  4. parsing
  

Question 287 :
‘The dish is displayed on the screen.’ Here, the type of ambiguity is ________


  1. Phonetic
  2. Lexical
  3. Structural
  4. Semantic
  

Question 288 :
How many morphemes are present in the word ‘desirability’?


  1. One
  2. Two
  3. Three
  4. Four
  

Question 289 :
Meaning representation bridges the gap between ________


  1. Linguistic knowledge and commonsense knowledge
  2. Dictionary knowledge and special knowledge
  3. Mother-tongue knowledge and commonsense knowledge
  4. Linguistic knowledge and mother-tongue knowledge
  

Question 290 :
Which of the following NLP tasks use the sequential labeling technique?


  1. POS tagging
  2. Named entity recognition
  3. Speech recognition
  4. POS tagging, named entity recognition, and speech recognition
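
A quick sequential-labelling illustration for Question 290: POS tagging assigns one label per token, in order. The sketch assumes NLTK is installed; the exact resource names to download can vary between NLTK versions:

```python
import nltk
nltk.download("punkt", quiet=True)                          # tokenizer models
nltk.download("averaged_perceptron_tagger", quiet=True)     # default POS tagger

tokens = nltk.word_tokenize("Maria flew to Mumbai yesterday")
print(nltk.pos_tag(tokens))
# e.g. [('Maria', 'NNP'), ('flew', 'VBD'), ('to', 'TO'), ('Mumbai', 'NNP'), ('yesterday', 'NN')]
```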
  

Question 291 :
The word ‘bass’ (as in pitch) and the word ‘bass’ (as in fish) are ________


  1. Homophones
  2. Homographs
  3. Synonyms
  4. Antonyms
  

Question 292 :
In linguistic morphology, _____________ is the process for reducing inflected words to their root form.


  1. Rooting
  2. Stemming
  3. Text-Proofing
  4. Proofing
  

Question 293 :
A grammar that produces more than one parse tree for the same sentence is called _______


  1. Contiguous
  2. Ambiguous
  3. Unambiguous
  4. Regular
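
For Question 293, the classic ambiguous grammar from the NLTK book produces two parse trees for the same sentence (assumes NLTK is installed):

```python
import nltk

# The PP-attachment example: one sentence, two parse trees.
groucho_grammar = nltk.CFG.fromstring("""
S -> NP VP
PP -> P NP
NP -> Det N | Det N PP | 'I'
VP -> V NP | VP PP
Det -> 'an' | 'my'
N -> 'elephant' | 'pajamas'
V -> 'shot'
P -> 'in'
""")

parser = nltk.ChartParser(groucho_grammar)
sentence = "I shot an elephant in my pajamas".split()
trees = list(parser.parse(sentence))

print(len(trees))        # 2 -> the grammar is ambiguous for this sentence
for tree in trees:
    print(tree)
```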
  

Question 294 :
Which of these techniques is used for normalization in text mining?


  1. Rooting
  2. Stop words removal
  3. Removing stopwords
  4. Text wrapping
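
A small normalization sketch for Question 294 showing stop-word removal with NLTK's English stop-word list (assumes NLTK is installed and the `stopwords` corpus can be downloaded):

```python
import nltk
nltk.download("stopwords", quiet=True)
from nltk.corpus import stopwords

stop_words = set(stopwords.words("english"))
tokens = "this is a simple example of stop word removal".split()

# Normalization step: drop very frequent function words that carry little content.
print([t for t in tokens if t not in stop_words])
# ['simple', 'example', 'stop', 'word', 'removal']
```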
  

Question 295 :
Which of the following is a merit of Context-Free Grammar?


  1. Simplest style of grammar
  2. They are highly precise
  3. High speed
  4. Efficiency
  

Question 296 :
The words ‘there’ and ‘their’ cause which of the following types of ambiguity?


  1. Syntactic
  2. Semantic
  3. Phonological
  4. Pragmatic
  

Question 297 :
Which of the following is the major problem in Machine Translation?


  1. Referential Ambiguity
  2. Stop word
  3. Emoticons
  4. Proper Noun
  

Question 298 :
Which of the following is not a learning approach for a QA system?


  1. Unsupervised approach
  2. Supervised approach
  3. Knowledge based approach
  4. Sense disambiguation approach
  

Question 299 :
In this technique, content is extracted from the original data, but the extracted content is not modified in any way.


  1. Extraction-based summarization
  2. Abstraction-based summarization
  3. Aided summarization
  4. Keyphrase extraction
  

Question 300 :
Consider the Hindi sentence ‘Mujhe khaanna khaanna hai’ (‘I have to eat food’). What will be the tag of the third word in the given sentence?


  1. Noun
  2. Verb
  3. Adverb
  4. Auxiliary verb
  