Natural Language Processing — Prelims Questions
Which of the following is NOT a core task or technique within Natural Language Processing (NLP)?
Consider the following statements regarding Transformer models in Natural Language Processing:
1. They rely primarily on Recurrent Neural Networks (RNNs) for sequential processing.
2. They utilize an 'attention mechanism' to weigh the importance of different words in a sequence.
3. BERT and GPT are prominent examples of models based on the Transformer architecture.
Which of the statements given above is/are correct?
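For context on statement 2, the attention mechanism can be sketched as scaled dot-product attention, the core operation of the Transformer. This is a minimal illustrative sketch using tiny hand-made matrices, not a trained model; the function name and toy inputs are assumptions for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weigh the value vectors V by how well each query in Q matches each key in K."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of every query to every key
    # Softmax turns scores into attention weights that sum to 1 per query
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Three toy "word" vectors serving as queries, keys, and values at once (self-attention)
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
output, weights = scaled_dot_product_attention(X, X, X)
```

Each row of `weights` shows how much one word attends to every word in the sequence, which is what "weighing the importance of different words" means in statement 2.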
Which of the following Indian government initiatives or platforms is explicitly designed to leverage Natural Language Processing for multilingual support and digital inclusion?
Regarding the ethical implications of Natural Language Processing, which of the following is a significant concern?
Which of the following best describes the function of 'Word Embeddings' in Natural Language Processing?
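As background for this question: word embeddings map words to dense numeric vectors so that semantically related words lie close together. A minimal sketch, using invented toy vectors rather than learned embeddings (the words and values below are assumptions for illustration):

```python
import numpy as np

# Toy 3-dimensional "embeddings"; real models (e.g. Word2Vec, GloVe)
# learn hundreds of dimensions from large corpora.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, near 0 for unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words should score higher than unrelated ones
sim_royal = cosine(embeddings["king"], embeddings["queen"])
sim_fruit = cosine(embeddings["king"], embeddings["apple"])
```

The key point for the question: embeddings turn words into vectors whose geometry (here, cosine similarity) captures semantic relatedness, so `sim_royal` comes out higher than `sim_fruit`.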