SKILL BENCHMARK
Natural Language Processing Mastery (Expert Level)
- 23m
- 23 questions
The Natural Language Processing Mastery (Expert Level) benchmark measures your experience with advanced natural language processing (NLP) techniques, such as transformer models, BERT, and GPT, for building sophisticated NLP applications. A learner who scores high on this benchmark demonstrates mastery in developing expert-level NLP applications.
Topics covered
- define few-shot learning as used in GPT-3 (see the prompt sketch after this list)
- define the Hugging Face library and outline its benefits (see the pipeline sketch after this list)
- demonstrating English to French machine translation - assemble and train a model end to end
- demonstrating English to French machine translation - define attention and positional embedding for the input sequence (see the positional-encoding sketch after this list)
- demonstrating English to French machine translation - define training and validation datasets with input, target of specified batch size, and sequence length
- demonstrating English to French machine translation - perform exploratory data analysis (EDA) and data pre-processing
- demonstrating English to French machine translation - produce a new representation of input sequence using Transformer encoder
- demonstrating English to French machine translation - produce an output target sequence using Transformer decoder
- demonstrating text generation with GPT - perform greedy and beam searches and use basic sampling
- demonstrating text generation with GPT - perform Top K and Top P sampling (see the decoding sketch after this list)
- demonstrating text generation with GPT - use benchmark prompts to perform model generations given interesting inputs
- identify the encoder block in the Transformer architecture
- identify the fundamentals of the Transformer architecture
- outline and apply sequence-to-sequence (Seq2Seq) encoder-decoder network architecture
- outline key features of GPT models
- perform data pre-processing and BERT tokenization (see the tokenization sketch after this list)
- perform library and model setup and data exploration
- perform model evaluation and predict sentiment for raw text
- perform sentiment classification training with BERT and Hugging Face
- recall how to use the attention mechanism to improve NLP model results
- recognize BERT Transfer Learning approaches
- state the challenges for Transformer models
- state the fundamentals of multi-head attention in the Transformer architecture (see the attention sketch after this list)
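The sketches below illustrate several of the topics above. Model names, prompts, and example inputs are illustrative assumptions, not prescribed by the benchmark. First, a minimal few-shot prompt in the style of GPT-3 in-context learning: a short task description plus a handful of worked examples, which the model continues with no weight updates.

```python
# Illustrative few-shot prompt for GPT-3-style in-context learning.
# The labelled examples condition the model; no gradient updates occur.
prompt = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "peppermint => menthe poivrée\n"
    "plush giraffe => girafe en peluche\n"
    "cheese =>"
)
print(prompt)
# Sent to a GPT-3-style completion endpoint, the expected continuation
# is "fromage".
```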
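Next, a minimal sketch of the Hugging Face transformers library's pipeline API, which bundles tokenization, model loading, and post-processing into a single call; the pretrained model downloaded by default is chosen by the library, and the input sentence is an arbitrary example.

```python
from transformers import pipeline

# One call hides tokenizer and model setup behind a simple interface;
# the first run downloads a default pretrained sentiment model.
classifier = pipeline("sentiment-analysis")

print(classifier("Transformer models have made NLP far more accessible."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```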
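For the machine translation topics, a sketch of the sinusoidal positional encoding from "Attention Is All You Need", written here in PyTorch as an assumption; the benchmark's own exercises may use a different framework or learned positional embeddings instead.

```python
import math
import torch

def positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    position = torch.arange(seq_len).unsqueeze(1)                  # (seq_len, 1)
    div_term = torch.exp(
        torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model)
    )
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe

# Inject order information into a batch of token embeddings.
embeddings = torch.randn(2, 10, 512)                # (batch, seq_len, d_model)
print((embeddings + positional_encoding(10, 512)).shape)  # (2, 10, 512)
```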
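For the text generation topics, a sketch of greedy search, beam search, and Top-K/Top-P (nucleus) sampling via the transformers generate API; the "gpt2" checkpoint, the prompt, and the parameter values are illustrative choices.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("The future of NLP is", return_tensors="pt")

# Greedy search: take the single most probable token at every step.
greedy = model.generate(**inputs, max_new_tokens=30)

# Beam search: keep the 5 most probable partial sequences at each step.
beam = model.generate(**inputs, max_new_tokens=30, num_beams=5, early_stopping=True)

# Top-K plus Top-P sampling: draw from the 50 most probable tokens,
# restricted to the smallest set whose cumulative probability exceeds 0.92.
sampled = model.generate(
    **inputs, max_new_tokens=30, do_sample=True, top_k=50, top_p=0.92
)

for name, ids in (("greedy", greedy), ("beam", beam), ("sampled", sampled)):
    print(name, "->", tokenizer.decode(ids[0], skip_special_tokens=True))
```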
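For the BERT topics, a sketch of WordPiece tokenization and encoding with the transformers tokenizer; "bert-base-uncased", the sample sentence, and the sequence length are assumptions.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
text = "NLP benchmarks test practical skills."

# WordPiece splits rare words into sub-word units.
print(tokenizer.tokenize(text))

# Full encoding adds [CLS]/[SEP], pads to a fixed length, and returns the
# attention mask BERT uses to ignore padding.
encoded = tokenizer(
    text, padding="max_length", truncation=True, max_length=16, return_tensors="pt"
)
print(encoded["input_ids"])
print(encoded["attention_mask"])
```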
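Finally, a sketch of multi-head self-attention using PyTorch's built-in module; the embedding dimension, head count, and tensor shapes are arbitrary illustrative values.

```python
import torch
import torch.nn as nn

# Multi-head attention projects queries, keys, and values into several
# subspaces ("heads"), attends in each head independently, and
# concatenates the results back to the model dimension.
mha = nn.MultiheadAttention(embed_dim=512, num_heads=8, batch_first=True)

x = torch.randn(2, 10, 512)      # (batch, seq_len, embed_dim)
out, weights = mha(x, x, x)      # self-attention: query = key = value
print(out.shape)                 # torch.Size([2, 10, 512])
print(weights.shape)             # torch.Size([2, 10, 10]), averaged over heads
```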