Final Exam: Natural Language Processing

Natural Language Processing    |    Intermediate
  • 1 video | 32s
  • Includes Assessment
Rating 4.8 of 5 (5 users)
Final Exam: Natural Language Processing will test your knowledge and application of the topics presented throughout the Skillsoft Aspire Natural Language Processing Journey.

WHAT YOU WILL LEARN

  • Illustrate phonemes, morphemes, and lexemes
  • Describe syntactic and semantic analysis for NLP
  • Describe heuristic approaches to solving NLP tasks
  • Specify challenges with NLP in real-world problem solving
  • Illustrate various NLP tools used across different industries
  • Explain a basic overview of the NLTK ecosystem
  • Classify the differences between spaCy and NLTK
  • Explain an overview of spaCy models and language support
  • Perform spaCy installation and load models; demonstrate part-of-speech tagging, morphology, and lemmatization; demonstrate dependency parsing, named entities, entity linking, tokenization, and merging & splitting; demonstrate sentence segmentation and sentence similarity
  • Illustrate and extract synonyms and identify WordNet hierarchies - hypernyms and hyponyms
  • Demonstrate the Python re module: re.search, findall, finditer, groups, find and replace, and split
  • Identify anchors, character classes, greedy and lazy matching, backtracking, and performance
  • Perform exploration of meronyms and holonyms
  • Illustrate one-hot encoding, bag of words, n-grams, and TF-IDF
  • Demonstrate data loading and a basic overview of columns
  • Demonstrate simple model building & evaluation
  • Restate logistic regression, SVM, naive Bayes & boosting models
  • Explain Polyglot & TextBlob, their benefits over NLTK & spaCy, and their use cases
  • Explain Gensim
  • Demonstrate query similarity using Gensim
  • Demonstrate building an LDA model for topic modeling using Gensim
  • Explain NLP with deep learning
  • Illustrate various NLP use cases across different industries
  • Explain a basic overview of the spaCy and TensorFlow ecosystems
  • Illustrate an overview of data
  • Illustrate the basics of data loading and columns
  • Illustrate the single-layer and multi-layer perceptron architectures of neural networks
  • Describe the recurrent neural network architecture and how it can capture context in language
  • Illustrate different applications of basic neural network based architectures
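Several of the objectives above (one-hot encoding, bag of words, n-grams, TF-IDF) can be sketched in a few lines of plain Python. The function below is an illustrative, from-scratch TF-IDF using raw term frequency and log(N / df); the tiny document list is a made-up example, not exam material:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    TF is the raw term count normalized by document length;
    IDF is log(N / df), where df is the number of documents
    containing the term.
    """
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["the", "cat", "sat"], ["the", "dog", "barked"]]
w = tf_idf(docs)
# "the" appears in every document, so its IDF (and weight) is 0
```

Library implementations (e.g. scikit-learn's TfidfVectorizer) add smoothing and normalization on top of this basic scheme, so their exact numbers differ.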
  • Describe the steps to load the Amazon product reviews dataset into Google Colaboratory
  • Explain the importance of memory-based learning and the different networks it supports
  • Perform review classification using a GRU
  • Describe the gated recurrent unit and its difference from an RNN
  • Illustrate different types of LSTM networks
  • Explain what transfer learning is and how it helps to get better results; explore different types of learning and different types of transfer learning; specify advantages & challenges of transfer learning in real-world problem solving
  • Describe and explore fastText & Word2Vec
  • Implement and fine-tune a language model using ULMFiT
  • Describe and explore ELMo
  • Explain the GitHub bug data and problem statement; perform library loading, data loading, and a basic overview of columns
  • Demonstrate exploratory data analysis (EDA) - word count analysis and label analysis
  • Demonstrate EDA - punctuation analysis, stop word analysis, and word clouds
  • Demonstrate fundamentals of the transformer architecture
  • Discover the use of the attention mechanism to improve NLP model results
  • Explain the encoder-decoder network architecture and its applications
  • Demonstrate fundamentals of the self-attention layer in the transformer architecture
  • Explore BERT transfer learning approaches
  • Perform model evaluation and predict sentiment for raw text
  • Discover different types of BERT models and their significance
  • Perform data pre-processing and BERT tokenization
  • Discover fundamentals of language models; explain the basics of GPT models
  • Describe the few-shot learning concept used in GPT-3
  • Discover different industry use cases and challenges for GPT
  • Demonstrate greedy search, beam search, and basic sampling
  • Explore the machine translation problem and its applications
  • Use the transformer encoder to produce a new representation of the input sequence
  • Frame the problem end to end and understand how single-sentence translation works
  • Use the transformer decoder to produce the output target sequence
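The decoding objectives above (greedy search, beam search, basic sampling) can be illustrated with a toy next-token model. The transition table below is an invented example for demonstration only; real decoders score token sequences from a trained language model in the same way:

```python
def greedy_decode(model, start, steps):
    """Pick the single most probable next token at each step."""
    seq = [start]
    for _ in range(steps):
        probs = model(seq[-1])
        seq.append(max(probs, key=probs.get))
    return seq

def beam_decode(model, start, steps, width=2):
    """Keep the `width` highest-probability partial sequences."""
    beams = [([start], 1.0)]
    for _ in range(steps):
        candidates = []
        for seq, score in beams:
            for tok, p in model(seq[-1]).items():
                candidates.append((seq + [tok], score * p))
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:width]
    return beams[0][0]

# Hypothetical next-token distributions over a tiny vocabulary.
TABLE = {
    "a": {"b": 0.6, "c": 0.4},
    "b": {"end": 0.4, "d": 0.6},
    "c": {"end": 0.95, "d": 0.05},
}
model = lambda tok: TABLE.get(tok, {"end": 1.0})

# Greedy commits to "b" early (p=0.6) and ends with probability 0.36,
# while beam search keeps the "c" branch alive and finds the better
# overall path with probability 0.38.
greedy = greedy_decode(model, "a", 2)   # ['a', 'b', 'd']
beam = beam_decode(model, "a", 2)       # ['a', 'c', 'end']
```

Basic sampling would instead draw the next token at random in proportion to these probabilities rather than maximizing.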
