Question 751 :
Cohesion binds text together. Consider the following piece of text: "Yesterday, my friend invited me to her house. When I reached, my friend was preparing coffee. Her father was cleaning dishes. Her mother was busy writing a book." Each occurrence of "her" in the above text refers to which noun phrase?
- Me
- Friend's father
- Friend's mother
- My friend's
Question 752 :
NLP is concerned with the interactions between ______.
- Computers and human (natural) languages
- Machine and machine
- Human and machine
- Both a) and b)
Question 753 :
Rule-based POS taggers do not possess which of the following properties?
- The rules in rule-based POS tagging are built automatically
- These taggers are knowledge-driven taggers
- These taggers consist of many hand-written rules
- The information is coded in the form of rules
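Rule-based taggers are knowledge-driven: their rules are written by hand, not built automatically. A minimal sketch of the idea using NLTK's RegexpTagger, where the regex patterns are illustrative hand-written rules rather than a serious rule set:

    # Each (pattern, tag) pair is a hand-written rule; the first match wins.
    from nltk.tag import RegexpTagger

    patterns = [
        (r'.*ing$', 'VBG'),  # gerunds, e.g. "preparing"
        (r'.*ed$', 'VBD'),   # simple past, e.g. "invited"
        (r'.*s$', 'NNS'),    # plural nouns, e.g. "dishes"
        (r'.*', 'NN'),       # default: everything else is tagged as a noun
    ]

    tagger = RegexpTagger(patterns)
    print(tagger.tag(['she', 'was', 'preparing', 'coffee']))
    # [('she', 'NN'), ('was', 'NNS'), ('preparing', 'VBG'), ('coffee', 'NN')]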
Question 754 :
The words that pronouns refer back to are called __________.
- Antecedent
- Context
- Reference
- Speech act
Question 755 :
The reference to an entity that has been previously introduced into the sentence is called __________.
- discourse
- anaphora
- coreference
- referent
Question 756 :
In the sentence, 'He ate the pizza', the BOLD part is an example of _____.
- Noun phrase
- Verb phrase
- Prepositional phrase
- Adverbial phrase
Question 757 :
Which one of the following is not a tool/technique that can be used for sentiment analysis?
- SentiWordNet
- Latent semantic analysis
- Abstractive analysis
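SentiWordNet attaches positive/negative/objective scores to WordNet synsets. A small lookup sketch via NLTK (assumes the wordnet and sentiwordnet corpora have already been downloaded):

    # Prerequisite (one-time): import nltk; nltk.download('wordnet'); nltk.download('sentiwordnet')
    from nltk.corpus import sentiwordnet as swn

    # 'a' restricts the lookup to adjective senses of the word.
    for synset in swn.senti_synsets('happy', 'a'):
        print(synset, synset.pos_score(), synset.neg_score())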
Question 758 :
Which class of words is limited in number?
- Open class
- Closed class
- Tree bank
- Dictionary
Question 759 :
What is the main reason for tokenization?
- It is the simplest process
- Processing can be easily performed on individual words
- Almost all tokenization algorithms execute in polynomial time
- Ready-made programs are available in various programming languages
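Tokenization splits raw text into word tokens so that later word-level processing becomes straightforward. A minimal regex-based sketch:

    import re

    def tokenize(text):
        # \w+ keeps runs of word characters; punctuation is dropped.
        return re.findall(r"\w+", text.lower())

    print(tokenize("Yesterday, my friend invited me to her house."))
    # ['yesterday', 'my', 'friend', 'invited', 'me', 'to', 'her', 'house']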
Question 760 :
Which type of ambiguity is present in the sentence "Old men and women were taken to safe locations"?
- Attachment ambiguity
- Scope Ambiguity
- Discourse ambiguity
- Semantics Ambiguity
Question 761 :
Suppose we want to calculate the probability of the observation sequence {'Dry', 'Rain'}. If the following are the possible hidden state sequences, then P('Dry', 'Rain') = ______.
Transition probabilities: P('Low'|'Low') = 0.3, P('High'|'Low') = 0.7, P('Low'|'High') = 0.2, P('High'|'High') = 0.8
Observation probabilities: P('Rain'|'Low') = 0.6, P('Dry'|'Low') = 0.4, P('Rain'|'High') = 0.4, P('Dry'|'High') = 0.3
Initial probabilities: P('Low') = 0.4, P('High') = 0.6
- 0.1748
- 0.2004
- 0.1208
- 0.2438
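For intuition, a brute-force sketch that sums P(s1) * P(obs1|s1) * P(s2|s1) * P(obs2|s2) over all four hidden state sequences, keeping the probabilities exactly as stated in the question:

    from itertools import product

    init = {'Low': 0.4, 'High': 0.6}
    trans = {('Low', 'Low'): 0.3, ('Low', 'High'): 0.7,
             ('High', 'Low'): 0.2, ('High', 'High'): 0.8}
    # Emission values as stated above (note the 'High' row does not sum to 1).
    emit = {('Low', 'Rain'): 0.6, ('Low', 'Dry'): 0.4,
            ('High', 'Rain'): 0.4, ('High', 'Dry'): 0.3}

    obs = ['Dry', 'Rain']
    total = 0.0
    for s1, s2 in product(['Low', 'High'], repeat=2):
        total += init[s1] * emit[(s1, obs[0])] * trans[(s1, s2)] * emit[(s2, obs[1])]
    print(total)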
Question 762 :
The relation beverage → {coffee, tea, shake} is an example of ______.
- Meronymy
- Hyponymy
- Polysemy
- Clines
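Hyponymy is the "kind-of" relation: beverage is the hypernym, and coffee, tea, and shake are its hyponyms. WordNet encodes this directly (assumes the wordnet corpus has been downloaded):

    # Prerequisite (one-time): import nltk; nltk.download('wordnet')
    from nltk.corpus import wordnet as wn

    beverage = wn.synset('beverage.n.01')
    print([h.name() for h in beverage.hyponyms()][:5])  # specific drinks appear here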
Question 763 :
Given a sentence S = w1 w2 w3 ... wn, how would you compute the likelihood of S using a bigram model?
- Calculate the conditional probability of each word in the sentence given the preceding word and add the resulting numbers
- Calculate the conditional probability of each word in the sentence given the preceding word and multiply the resulting numbers
- Calculate the conditional probability of each word given all preceding words in a sentence and add the resulting numbers
- Calculate the conditional probability of each word given all preceding words in a sentence and multiply the resulting numbers
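A minimal sketch of the bigram chain rule: the sentence likelihood is the product of each word's conditional probability given the preceding word. The probability table below is an invented toy example, not real corpus estimates:

    bigram_prob = {
        ('<s>', 'he'): 0.2,
        ('he', 'ate'): 0.1,
        ('ate', 'the'): 0.4,
        ('the', 'pizza'): 0.05,
    }

    sentence = ['<s>', 'he', 'ate', 'the', 'pizza']
    likelihood = 1.0
    for prev, word in zip(sentence, sentence[1:]):
        likelihood *= bigram_prob[(prev, word)]  # multiply, not add
    print(likelihood)  # 0.2 * 0.1 * 0.4 * 0.05 = 0.0004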
Question 764 :
A CFG consists of ______.
- Set of rules
- Set of productions
- Order of elements
- Rules, productions, and order of elements
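A tiny CFG sketch in NLTK: the grammar is a set of production rules, and the order of elements on each right-hand side matters:

    import nltk

    grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> 'he' | Det N
        VP -> V NP
        Det -> 'the'
        N -> 'pizza'
        V -> 'ate'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse(['he', 'ate', 'the', 'pizza']):
        print(tree)  # (S (NP he) (VP (V ate) (NP (Det the) (N pizza))))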
Question 765 :
What is the right order of components for a text classification model? 1. Text cleaning 2. Text annotation 3. Gradient descent 4. Model tuning 5. Text to predictors
- 12345
- 13425
- 12534
- 13452
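A hedged scikit-learn sketch of those components compressed into one pipeline: cleaning and text-to-predictors happen in the vectorizer, gradient descent in the classifier, and tuning would follow. The toy texts and labels are invented:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import SGDClassifier
    from sklearn.pipeline import Pipeline

    texts = ["free money now", "meeting at noon", "win a prize", "lunch tomorrow"]
    labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham (made-up labels)

    model = Pipeline([
        ('to_predictors', TfidfVectorizer(lowercase=True)),  # cleaning + vectorizing
        ('classifier', SGDClassifier()),                     # gradient-descent training
    ])
    model.fit(texts, labels)
    print(model.predict(["win free money"]))  # likely [1] on this toy data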
Question 766 :
Which of the following techniques is most appropriate for obtaining the root of a word without considering word syntax?
- Stemming
- Lemmatization
- Stop word removal
- Rooting
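Stemming crudely strips suffixes without consulting word syntax or a vocabulary, whereas lemmatization returns a dictionary form. A small NLTK contrast (lemmatization assumes the wordnet corpus is downloaded):

    from nltk.stem import PorterStemmer, WordNetLemmatizer

    print(PorterStemmer().stem('studies'))           # 'studi' -- not a real word
    print(WordNetLemmatizer().lemmatize('studies'))  # 'study' -- a dictionary form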
Question 767 :
How many lexemes are there in the following list: man, men, girls, girl, mouse?
- 4
- 5
- 3
- 2
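One way to check: reduce each word form to its lemma and count the distinct results (assumes the wordnet corpus is downloaded):

    from nltk.stem import WordNetLemmatizer

    lemmatizer = WordNetLemmatizer()
    words = ['man', 'men', 'girls', 'girl', 'mouse']
    lexemes = {lemmatizer.lemmatize(w) for w in words}
    print(len(lexemes))  # 3 lexemes: man, girl, mouse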
Question 768 :
Which is the most suitable technique for finding trending topics on Twitter?
- Term frequency
- NER
- Tokenization
- Segmentation
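A toy sketch of trend detection by term frequency: count term occurrences across a batch of tweets and take the most frequent. The tweets are invented:

    from collections import Counter

    tweets = ["nlp exam today", "nlp notes please", "coffee break", "nlp revision"]
    counts = Counter(word for tweet in tweets for word in tweet.split())
    print(counts.most_common(1))  # [('nlp', 3)]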
Question 769 :
Which is not a type of antonym?
- Polar antonyms
- Equipollent antonyms
- Overlapping antonyms
- Unipolar antonyms
Question 770 :
Classifying email as spam, labelling web pages based on their content, and voice recognition are examples of _____.
- Supervised learning
- Unsupervised learning
- Machine learning
- Deep learning