Question 151 :
Which type of semantics is concerned with how words combine to form larger meanings?
- Compound semantics
- Compositional semantics
- Lexical semantics
- Word Semantics
Question 152 :
Part-of-speech tagging does not determine ___________
- part-of-speech for each word dynamically as per meaning of the sentence
- part-of-speech for each word dynamically as per sentence structure
- all part-of-speech for a specific word given as input
- all part-of-speech for a specific stem from input
Question 153 :
Which of the following NLP problems cannot be solved with a Hidden Markov Model (HMM)?
- POS tagging
- Speech recognition
- Spelling correction
- Stemming
Question 154 :
The tour includes three Asian countries. Which is a noun phrase?
- The tour includes
- three Asian countries
- Three Asian
- Tour includes
Question 155 :
Choose the area from the following where NLP is not useful.
- Automatic Text Summarization
- Automatic Question-Answering Systems
- Partially Observable systems
- Information Retrieval
Question 156 :
Polysemy is defined as the coexistence of multiple meanings for a word or phrase in a text object. Which of the following models is likely the best choice to correct this problem?
- Random Forest Classifier
- Convolutional Neural Networks
- Gradient Boosting
- Keyword Hashing
Question 157 :
Probabilistic context-free grammar (PCFG) is also known as the __________
- Stochastic context-free grammar
- Context sensitive context-free grammar
- Regular grammar
- Unrestricted context free grammar
Question 158 :
Which of the following measurements are used to evaluate the quality of entity recognition?
- Precision
- Recall
- F-measure
- R-measure
Question 159 :
_________ is the second stage in NLP.
- Syntactic Analysis
- Discourse Analysis
- Semantic Analysis
- Pragmatic Analysis
Question 160 :
John and Mary love their Acuras. They drive them all the time. This is an example of _______
- Indefinite noun phrase
- Definite noun phrase
- Demonstrative
- Discontinuous sets
Question 161 :
What was first defined for natural language by Chomsky (1957)?
- Context-Free Grammar (CFG)
- Finite Automata (FA)
- Push-Down Automata (PDA)
- Turing Machine
Question 162 :
Capability vs Capabilities is an example of ______ morphology.
- Inflectional
- Normalization
- Cliticization
- Derivational
Question 163 :
The metric to measure 'the intensity of emotion provoked by the stimulus' in emotion modeling is:
- Severity
- Valence
- Arousal
- Dominance
Question 164 :
Which semantic relation exists between the words piece and peace?
- Homophony
- Homonymy
- Hypernymy
- Meronymy
Question 165 :
The main aim of Natural Language Processing is to ____________ the human language.
- Cipher
- Index
- Understand
- Complicate
Question 166 :
Token and morpheme are always the same.
- Yes
- No
- Probability based
- Randomization based
Question 167 :
Which of the following is not a type of antonym?
- Polar antonyms
- Equipollent antonyms
- Overlapping antonyms
- Unipolar antonyms
Question 168 :
__________ ambiguity refers to a situation where the context of a phrase gives it multiple interpretations.
- Pragmatic
- Anaphoric
- Discourse
- Cataphoric
Question 169 :
Sentence realization is part of which stage?
- Syntactic Analysis
- Discourse Analysis
- Semantic Analysis
- Pragmatic Analysis
Question 170 :
Which are words that have the same form but different, unrelated meanings?
- Polysemy
- Homonyms
- Synonymy
- Antonymy
Question 171 :
One of the important factors for accurate machine translation is
- N-grams
- Resolving sense ambiguity
- Testing data
- Human translators
Question 172 :
In reference resolution, the entity that is referred to is called the _________.
- corefer
- referent
- anaphora
- subject
Question 173 :
A ___________ is a word that resembles a preposition or an adverb, and that often combines with a verb to form a larger unit called a phrasal verb.
- Preposition
- Determiners
- Particle
- Adjectives
Question 174 :
The Porter Stemmer algorithm is used for _______.
- Lemmatization
- Syntax Analysis
- Stemming
- Part of speech tagging
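The answer above is stemming: rule-based suffix stripping that reduces inflected forms to a common stem. The sketch below is a deliberately simplified illustration of that idea, not the real Porter algorithm (which applies five ordered rule phases with measure conditions on the stem); the suffix rules here are a hypothetical subset chosen for clarity.

```python
# A minimal suffix-stripping sketch in the spirit of the Porter Stemmer.
# NOT the full Porter algorithm: just the core idea of rule-based stemming,
# i.e. chop a known suffix to reach a common stem.

SUFFIX_RULES = [
    ("sses", "ss"),  # caresses -> caress
    ("ies", "i"),    # ponies   -> poni
    ("ing", ""),     # walking  -> walk
    ("ed", ""),      # played   -> play
    ("s", ""),       # cats     -> cat
]

def simple_stem(word: str) -> str:
    """Apply the first matching suffix rule, like a single Porter phase."""
    for suffix, replacement in SUFFIX_RULES:
        if word.endswith(suffix):
            return word[: len(word) - len(suffix)] + replacement
    return word

print(simple_stem("caresses"))  # caress
print(simple_stem("ponies"))    # poni
print(simple_stem("cats"))      # cat
```

Note that stemming may produce non-words such as "poni"; that is expected, since a stemmer only needs a consistent stem, not a dictionary lemma (which is what lemmatization provides).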
Question 175 :
The bigram model approximates the probability of a word given all the previous words by using:
- The conditional probability of all the previous words
- The maximum likelihood estimation of the given word
- Only the conditional probability of the preceding word
- The maximum likelihood estimation of the preceding word
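As the correct option states, a bigram model conditions each word only on the immediately preceding word, with maximum-likelihood estimates P(w | prev) = count(prev, w) / count(prev). A minimal sketch (the toy corpus is made up for illustration):

```python
# Bigram MLE sketch: P(w_i | w_{i-1}) = count(w_{i-1}, w_i) / count(w_{i-1}).
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()

unigram_counts = Counter(corpus)
bigram_counts = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev: str, word: str) -> float:
    """MLE probability of `word` given the single preceding word."""
    return bigram_counts[(prev, word)] / unigram_counts[prev]

# "the" occurs 3 times; "the cat" occurs 2 times.
print(bigram_prob("the", "cat"))  # 0.666...
```

A real language model would add smoothing for unseen bigrams and sentence-boundary markers; those are omitted to keep the estimation step visible.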
Question 176 :
In NLP, which algorithm decreases the weight of commonly used words and increases the weight of words that occur rarely in a collection of documents?
- Term Frequency (TF)
- Inverse Document Frequency (IDF)
- Word2Vec
- Latent Dirichlet Allocation (LDA)
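The behaviour described in the question is inverse document frequency (IDF). A minimal sketch using the common unsmoothed formula idf(t) = log(N / df(t)) (the toy document collection is invented; real implementations such as scikit-learn's add smoothing terms):

```python
# IDF sketch: idf(t) = log(N / df(t)). Terms appearing in many documents
# get low weight; rare terms get high weight.
import math

docs = [
    {"the", "cat", "sat"},
    {"the", "dog", "ran"},
    {"the", "cat", "ran"},
]
N = len(docs)

def idf(term: str) -> float:
    """Unsmoothed inverse document frequency over the toy collection."""
    df = sum(1 for doc in docs if term in doc)  # document frequency
    return math.log(N / df)

print(idf("the"))  # 0.0  -> appears in every document, weight vanishes
print(idf("dog"))  # ~1.1 -> appears in one document, high weight
```

Multiplying this by a term's frequency within a document gives the familiar TF-IDF score.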
Question 177 :
Which of the following is an example of an irregular noun form?
- Fox
- Dog
- Mouse
- Cat
Question 178 :
Which of the following architectures can be trained faster and needs less training data?
- LSTM-based language modelling
- Transformer architecture
- Word sense disambiguation
- N-grams
Question 179 :
_________________ is mapping a sentence plan into sentence structure.
- Text planning
- Sentence Planning
- Text Realisation
- Text Mapping
Question 180 :
_______ is used to decode the optimal tag sequence.
- Earley algorithm
- Viterbi algorithm
- Lesk algorithm
- A centering algorithm
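The Viterbi algorithm finds the most probable state (tag) sequence under an HMM by dynamic programming. A minimal sketch with a two-tag toy model (all probabilities below are invented for illustration):

```python
# Minimal Viterbi decoder for a toy two-tag HMM.
# All probabilities are made-up illustrative values, not trained estimates.

states = ["NOUN", "VERB"]
start_p = {"NOUN": 0.6, "VERB": 0.4}
trans_p = {
    "NOUN": {"NOUN": 0.3, "VERB": 0.7},
    "VERB": {"NOUN": 0.8, "VERB": 0.2},
}
emit_p = {
    "NOUN": {"dogs": 0.5, "bark": 0.1},
    "VERB": {"dogs": 0.1, "bark": 0.6},
}

def viterbi(obs):
    """Return the most likely tag sequence for the observed words."""
    # V[t][s] = (best prob of any path ending in state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the best final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

print(viterbi(["dogs", "bark"]))  # ['NOUN', 'VERB']
```

The dynamic program keeps only the best path into each state at each step, so decoding costs O(T x S^2) instead of enumerating all S^T tag sequences.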