SKILL BENCHMARK

NLP with Deep Learning Proficiency (Advanced Level)

  • 18m
  • 18 questions
The NLP with Deep Learning Proficiency (Advanced Level) benchmark measures your knowledge of out-of-the-box transformer models for Natural Language Processing (NLP). You will be evaluated on your ability to apply attention-based models and transformers to NLP tasks. A learner who scores high on this benchmark demonstrates solid experience in developing NLP applications with deep learning models, including transformer architectures, and can work on projects with minimal supervision.

Topics covered

  • calculate query, key, and value for transformer models (first sketch below)
  • calculate the loss and accuracy for a translation model
  • classify sentences with a BERT model (second sketch below)
  • compute text similarity with the Universal Sentence Encoder (USE) (third sketch below)
  • describe attention-based models and transformers
  • describe how multi-head attention works
  • generate translations using an attention-based model
  • perform subword tokenization with WordPiece (fourth sketch below)
  • preprocess data for a transformer model
  • provide an overview of transformer models for language processing
  • set up a Bidirectional Encoder Representations from Transformers (BERT) model for text similarity
  • set up a decoder model with attention
  • set up an encoder-decoder model
  • set up the encoder and decoder
  • train and generate predictions using an encoder-decoder model
  • train a transformer model
  • train the FNet model for sentiment analysis
  • use pre-trained embeddings from TensorFlow Hub
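
As a reference for the first topic above, here is a minimal sketch of calculating query, key, and value projections and applying scaled dot-product attention. It uses NumPy with arbitrary toy dimensions; the weight matrices here are random stand-ins for what would normally be learned parameters of a transformer layer.

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    """Compute attention weights and output for a single attention head."""
    d_k = query.shape[-1]
    # Similarity between each query and every key, scaled by sqrt(d_k).
    scores = query @ key.T / np.sqrt(d_k)
    # Softmax over the key dimension turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ value, weights

# Toy example: 4 tokens, model dimension 8, projected to d_k = d_v = 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))             # token embeddings
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
q, k, v = x @ w_q, x @ w_k, x @ w_v     # query, key, value projections
output, attn = scaled_dot_product_attention(q, k, v)
print(output.shape, attn.shape)         # (4, 4) (4, 4)
```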
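For classifying sentences with a BERT-style model, one common approach is the Hugging Face pipeline API. The checkpoint named below is a public DistilBERT model (a lighter BERT variant) fine-tuned on SST-2, chosen purely as an example and not necessarily the model used in the benchmark.

```python
from transformers import pipeline

# Example public checkpoint for binary sentiment classification;
# any BERT-family sequence-classification model could be swapped in.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("The attention mechanism made this model far more accurate.")
print(result)  # e.g. a list with a dict containing 'label' and 'score'
```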
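Computing text similarity with the Universal Sentence Encoder typically means embedding sentences through TensorFlow Hub and comparing the embeddings with cosine similarity. A sketch, assuming the publicly hosted USE v4 module:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Load the Universal Sentence Encoder from TensorFlow Hub.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

sentences = ["The cat sat on the mat.", "A cat was sitting on a rug."]
embeddings = embed(sentences)            # 512-dimensional embedding per sentence

# Cosine similarity between the two sentence embeddings.
normalized = tf.math.l2_normalize(embeddings, axis=1)
similarity = tf.matmul(normalized, normalized, transpose_b=True)
print(similarity.numpy()[0, 1])
```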
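Subword tokenization with WordPiece can be tried with the tokenizer that ships with a pretrained BERT checkpoint. This sketch assumes the Hugging Face transformers library and the bert-base-uncased vocabulary; any WordPiece vocabulary behaves similarly.

```python
from transformers import BertTokenizer

# WordPiece tokenizer bundled with the public bert-base-uncased checkpoint.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

tokens = tokenizer.tokenize("Subword tokenization handles unfamiliar words gracefully.")
print(tokens)  # rare words are split into pieces, continuations marked with '##'
print(tokenizer.convert_tokens_to_ids(tokens))  # vocabulary ids fed to the model
```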