Total Questions: 20
Expected Time: 20 Minutes

1. How does 'context window' influence the performance of word embeddings in NLP?

2. Explain the concept of word sense disambiguation in Natural Language Processing and provide an example scenario where it is crucial.

3. Explain the concept of 'attention mechanism' in NLP and its role in sequence-to-sequence models.

4. Which technique is commonly used for sentiment analysis in NLP?

5. Discuss the role of a 'bidirectional LSTM' in NLP and its advantages over a traditional (unidirectional) LSTM.

6. What is the significance of the term 'TF-IDF' in document representation, and how does it contribute to NLP tasks?

7. What is the primary purpose of a confusion matrix in NLP evaluation?
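For reference, a confusion matrix simply counts (true label, predicted label) pairs; per-class metrics such as precision and recall fall out of it directly. A minimal sketch, using made-up sentiment labels purely for illustration:

```python
from collections import Counter

def confusion_matrix(y_true, y_pred, labels):
    """Count (true, predicted) label pairs into a nested dict."""
    pairs = Counter(zip(y_true, y_pred))
    return {t: {p: pairs[(t, p)] for p in labels} for t in labels}

y_true = ["pos", "pos", "neg", "neg", "pos"]
y_pred = ["pos", "neg", "neg", "pos", "pos"]
cm = confusion_matrix(y_true, y_pred, ["pos", "neg"])

# Precision and recall for the "pos" class read straight off the matrix:
tp = cm["pos"]["pos"]        # true positives
fp = cm["neg"]["pos"]        # false positives
fn = cm["pos"]["neg"]        # false negatives
precision = tp / (tp + fp)
recall = tp / (tp + fn)
```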

8. What is the purpose of cross-validation in NLP model training?

9. In named entity recognition, what does the 'LOC' tag represent?

10. What role does 'TF-IDF (Term Frequency-Inverse Document Frequency)' play in text analysis, and how is it calculated?
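The calculation asked about here can be sketched directly from the definitions: term frequency is a count normalized by document length, and inverse document frequency is log(N / df). A minimal, stdlib-only sketch (the toy corpus is illustrative; real implementations add smoothing):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    tf(t, d) = count of t in d / tokens in d
    idf(t)   = log(N / df(t)), df(t) = number of docs containing t
    """
    n_docs = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # count each term once per document
    weights = []
    for doc in docs:
        counts = Counter(doc)
        total = len(doc)
        weights.append({
            term: (count / total) * math.log(n_docs / df[term])
            for term, count in counts.items()
        })
    return weights

docs = [["the", "cat", "sat"], ["the", "dog", "ran"], ["the", "cat", "ran"]]
w = tf_idf(docs)
# "the" occurs in every document, so its idf (hence tf-idf) is 0
```

Note how a term appearing in every document gets weight zero, which is exactly the down-weighting of uninformative common words that the question is probing for.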

11. What is the role of a stop word in text processing?
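Stop-word removal is typically just a set-membership filter over tokens. A minimal sketch, with a small illustrative stop-word list (production pipelines use larger curated lists):

```python
# Illustrative stop-word list; real NLP toolkits ship much longer ones.
STOP_WORDS = {"the", "a", "an", "is", "in", "of", "and", "to"}

def remove_stop_words(tokens):
    """Drop high-frequency function words that carry little topical content."""
    return [t for t in tokens if t.lower() not in STOP_WORDS]

remove_stop_words(["The", "cat", "is", "in", "the", "hat"])
# -> ["cat", "hat"]
```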

12. In machine translation, what does the acronym BLEU stand for?

13. Discuss the significance of 'part-of-speech tagging' in NLP and its applications.

14. Discuss the significance of 'Named Entity Recognition (NER)' in NLP and its real-world applications.

15. What is 'word sense disambiguation' in NLP, and why is it important?

16. Which technique is commonly used for topic modeling in NLP?

17. What is the 'long-tail distribution' in the context of language processing?

18. Examine the impact of imbalanced datasets on the performance of Natural Language Processing models. Propose strategies to address this issue.

19. What is the purpose of an attention mechanism in NLP models?

20. Define 'lemmatization' and explain its significance in linguistic analysis.
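As a rough intuition for this last question: lemmatization maps inflected forms to a dictionary base form (lemma), using a lexicon rather than blind suffix-stripping. A deliberately toy sketch combining an irregular-form lookup with two suffix rules; real lemmatizers (e.g. the WordNet lemmatizer) consult a full lexicon and the word's part of speech:

```python
# Toy irregular-form table; a real lemmatizer uses a complete lexicon.
IRREGULAR = {"ran": "run", "better": "good", "mice": "mouse", "was": "be"}

def lemmatize(word):
    """Map a word to a base form: lexicon lookup first, then suffix rules."""
    word = word.lower()
    if word in IRREGULAR:
        return IRREGULAR[word]
    if word.endswith("ies"):
        return word[:-3] + "y"                 # studies -> study
    if word.endswith("s") and not word.endswith("ss"):
        return word[:-1]                       # cats -> cat
    return word
```

The irregular table is the key difference from stemming: "ran" becomes "run" and "mice" becomes "mouse", which no suffix rule could produce.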